PowerBI – How to set up a streaming data report?

I have a range of devices that push data to an Azure IoT Hub. I then have a Stream Analytics job which ingests the data from the IoT hub and outputs it to Power BI. In Power BI I have then created a report from the dataset, which is working. I now want that report to be 'live', using the streaming data from Stream Analytics, but I can't see how to do this.
Looking at this guide, under the section 'Create and publish a Power BI report to visualize the data', point 4 says 'Click Streaming Datasets', which sounds like exactly what I need, but I don't see that option anywhere.
Have I missed a step or set something up wrong, or is there another way to make my report ‘live’ without needing manual refresh and publishing?
Thanks

Go to the dashboard where you want to add the streaming data tile.
Click '+ Add tile' in the top navigation.
You should see an option for Real time - Custom streaming data on the resulting blade.

Related

Labeling action on data point in data analytics dashboard

We would like to create a data analytics dashboard (using tools such as PowerBI, Kibana or similar).
In a visualization we would like to give a user an additional opportunity in the front end to add information to the visualized data point - e.g. right-click on a data point, check a box that this value is now confirmed by an expert, and then store this info back to the database.
Is this use case already available in some tool/plugin?

Power BI :: Some credentials aren't validated as they are set to skip test connection

My problem is very simple: I'm retrieving all lithium stocks on the ASX market from this web page.
I created a Power BI dashboard that you can visualize here.
As the source is Web, I would like to refresh it every day using the gateway connection, so I have successfully installed the gateway on my computer, which I turn on every day.
The problem is that instead of a successful daily refresh I have a daily error:
Failed to test the connection to your data source. Please retry your credentials.
Some credentials aren't validated as they are set to skip test connection.
If I go into Edit credentials I see the error:
Failed to update data source credentials: The credentials provided for the Web source are invalid. (Source at https://www.marketindex.com.au/commodities/lithium.)
This is because I'm using Authentication method: Anonymous and Privacy level setting for this data source: Public. But this is exactly what I'm using in Power BI Desktop, and there it's working!
What am I doing wrong here?
Why does the same connection work fine in Power BI Desktop but not in the same dashboard uploaded online?
EDIT: the on-premises data gateway appears to be installed without problems.
It seems like the issues you're experiencing may be caused by the link you're using (https://www.marketindex.com.au/commodities/lithium), although I'm not sure exactly why.
For example, as has been mentioned in the Do I Need a Gateway for Web post, one can use Web.Contents instead of Web.BrowserContents in the raw M code in the Power Query Editor to avoid using a gateway. However, when I test this implementation using the link that you provided, I get an error when refreshing from Power BI online.
If I test this implementation using an HTML table from the https://tradingeconomics.com/commodity/lithium website, though, and set the data source credentials the same way (Anonymous authentication, Public privacy level), then I am able to successfully refresh the dataset from Power BI online.
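For reference, here is a minimal sketch of that Web.Contents approach in M (the table index 0 is an assumption; adjust it to whichever table Web.Page returns for the page):

let
    // Fetch the raw HTML with Web.Contents, which can refresh online without a gateway
    Source = Web.Page(Web.Contents("https://tradingeconomics.com/commodity/lithium")),
    // Assumes the table of interest is the first one Web.Page finds; adjust the index if not
    LithiumTable = Source{0}[Data]
in
    LithiumTable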
I tested this with another HTML table from a random webpage and an online refresh worked as well. So it seems like there is something odd about how Power BI online interacts with the Market Index website, and this is causing your issue, potentially because of some authentication configuration on the Market Index side of things. I'm not able to help with looking further into why this is occurring, though.
Would you be able to use a different website with similar information as a data source for your report? If so, you could avoid the issues you're experiencing and avoid using a gateway in general if you wanted.
If you go this route, it seems like the built-in "Scheduled refresh" setup in Power BI online works, but I will include the Power Automate scheduled refresh setup below in case this is desired instead.
Power Automate Scheduled Refresh
One can use Power Automate to "schedule" dataset refreshes instead of the built-in "Scheduled refresh" functionality in Power BI online. The steps are a little too long to include here, but the Refresh your Power BI dataset using Microsoft Flow article outlines the process of setting this up. There are various triggers you can use to start your flow: the linked article uses one that fires on a SharePoint action, but there is also a Recurrence trigger that can run your flow at regular intervals/times.
Just set up your flow, make sure that it is "turned on", and your dataset will refresh at regular intervals from the cloud.

Optimize data load from Azure Cosmos DB to Power BI

Currently we have a problem with loading data when refreshing the report data against the DB, since it has too many records and it takes forever to load everything. The question is how I can load only the data from the last year, to avoid taking so long loading everything. From what I can see, connecting to Cosmos DB allows me to enter an SQL query in the connection box, but I don't know how to write one for this type of non-relational database.
Power BI has an incremental refresh feature. You should be able to refresh the current year only.
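For reference, incremental refresh requires two datetime parameters named exactly RangeStart and RangeEnd in Power Query, plus a filter step that uses them; the refresh policy then only reloads the partitions inside that window. A minimal sketch against the Cosmos DB connector (the account URL, database, container, and OrderDate field are hypothetical):

let
    // v1 Azure Cosmos DB connector; returns one record per JSON document
    Source = DocumentDB.Contents("https://myaccount.documents.azure.com:443/", "mydb", "mycontainer"),
    // Expand only the fields needed for the report
    Expanded = Table.ExpandRecordColumn(Source, "Document", {"id", "OrderDate"}),
    Typed = Table.TransformColumnTypes(Expanded, {{"OrderDate", type datetime}}),
    // RangeStart/RangeEnd are the parameters the incremental refresh policy fills in
    Filtered = Table.SelectRows(Typed, each [OrderDate] >= RangeStart and [OrderDate] < RangeEnd)
in
    Filtered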
If that still doesn't meet expectations, I would look at a preview feature called Azure Synapse Link, which automatically replicates Cosmos DB updates into analytical storage that you can query much faster from Azure Synapse Analytics, in order to refresh Power BI faster.
Depending on the volume of the data you will hit a number of issues. The first is that you may exceed your RU limit, slowing down the extraction of the data from Cosmos DB. The second is transforming the data from JSON into a structured format.
I would try to write a query to specify the fields and items that you need. That will reduce the time of processing and getting the data.
For SQL queries it will be something like:
SELECT * FROM c WHERE c.partitionEntity = 'guid'
For more information on the Cosmos DB SQL API syntax, please see here to get started.
You can use the query window in Azure to run the SQL commands, or Azure Storage Explorer to test the query, then move it to Power BI.
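If you are using the Azure Cosmos DB connector in Power BI, such a query can be passed straight into the Source step via its options record, so only the listed fields are returned. A sketch with hypothetical account, database, container, and field names:

let
    Source = DocumentDB.Contents(
        "https://myaccount.documents.azure.com:443/", "mydb", "mycontainer",
        // Only the named fields cross the wire, which cuts both RU cost and transfer time
        [Query = "SELECT c.id, c.name, c.price FROM c WHERE c.partitionEntity = 'guid'"])
in
    Source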
It is highly recommended to extract the data into a place where it can be transformed into a structured format such as a table or CSV file.
For example, use Azure Databricks to extract the data, then turn the JSON into a table-formatted object.
You have the option of running Databricks notebook queries against Cosmos DB, or running Azure Databricks in its own instance. One other option would be to use the change feed plus an Azure Function to send and shred the data into Blob Storage, and query it from there using Power BI, Databricks, Azure SQL Database, etc.
In the Source step of your query, you can filter based on the Cosmos DB _ts system property, like:
Query ="SELECT * FROM XYZ AS t WHERE t._ts > 1609455599"
In this case, 1609455599 is the Unix timestamp corresponding to 31.12.2020, 23:59:59 (UTC+1). So only data from 2021 onwards will be selected.
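If you would rather not hardcode the epoch value, it can be computed in M from a date. A sketch (the container name XYZ comes from the snippet above; the account URL and database are hypothetical, and this cutoff is midnight 2021-01-01 UTC rather than a local-time boundary):

let
    // Seconds between the Unix epoch and the cutoff date
    Cutoff = Duration.TotalSeconds(#datetime(2021, 1, 1, 0, 0, 0) - #datetime(1970, 1, 1, 0, 0, 0)),
    Source = DocumentDB.Contents(
        "https://myaccount.documents.azure.com:443/", "mydb", "XYZ",
        [Query = "SELECT * FROM XYZ AS t WHERE t._ts > " & Number.ToText(Cutoff)])
in
    Source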

PowerBI - Automatic Refresh OneDrive Business From Excel File

I have an Excel sheet stored in a OneDrive for Business folder, which is updated continuously (approximately every minute). I am trying to show a live count of the number of entries in the table on a Power BI report.
From here I have tried two options:
1. Created a Power BI Desktop file which shows the total count on a single card. I have then published this to Power BI Service as a report.
2. Imported the Excel file via "Get Data -> Files -> OneDrive - Business" on PBI Service.
I loaded in the data and then created a report.
However, when a change is made to the Excel file on OneDrive, the report data does not update automatically. Instead, the only way it updates is via the "Refresh now" option on the dataset in PBI Service.
Then, once that is refreshed, I have to manually refresh the data in the report window as well.
The connection between the report and the Excel file is therefore available to allow the manual update, but for some reason it does not update automatically when I make a change. Are there any solutions available to get this to update automatically?
I saw this cool idea, but I can't seem to get it to work:
https://bigintsolutions.com/2019/03/29/refresh-power-bi-report-every-min-and-show-on-a-tv/
I have also read many threads, with some people having the same issue I am having:
https://community.powerbi.com/t5/Power-Query/Automatic-Refresh-not-working-when-connecting-to-SharePoint/td-p/546308
I know that there should be an update every hour for Power BI-OneDrive connections, but I was hoping there was a way to update live.
Goal: a Power BI Service report that updates automatically when a change is made (say, every minute), to feed live data to a TV screen.
Any help would be greatly appreciated!
Where are you sourcing your data? Writing to an Excel file each minute seems like something that won't scale in the long term. This sounds like a better scenario for a real-time dataset in hybrid mode. Then you could build a dashboard over the data and it would automatically update as the data updates. I've used Power Automate to push data into a real-time dataset as well.
Treb Gatte, Power BI MVP

How to update data in google cloud storage/bigquery for google data studio?

For context, we would like to visualize our data in Google Data Studio - this dataset receives more entries each week. I have tried hosting our data sets in Google Drive, but it seems that they're too large and this slows down Google Data Studio (the file is only 50 MB, am I doing something wrong?).
I have loaded our data into Google Cloud Storage --> Google BigQuery, and connected my Google Data Studio report to my BigQuery table. This has made the Google Data Studio dashboard much quicker!
I'm not sure what the best way is to update our data weekly in Google Cloud/BigQuery. I have found a slow way to do this by uploading the new weekly data to Google Cloud Storage, then appending the data to my table manually in BigQuery, but I'm wondering if there's a better way to do this (or at least a more automated one)?
I'm open to any suggestions, and if you think that BigQuery/Google Cloud Storage is not the answer for me, please let me know!
If I understand your question correctly, you want to automate the query that populate your table, which is connected to Data Studio.
If this is the case, then you can use Scheduled Queries in BigQuery. A scheduled query allows you to define a query whose results are written to a destination table. In particular, you can specify rules for repetition (at minimum every 15 minutes) and execution, as well as destination write options (destination table; write mode: append or truncate).
In order to use Scheduled Queries, your account must have the right permissions. You can have a look at the following documentation to better understand how to use them [1].
Also, please note that on the front end, the updated data in the BigQuery table will only be seen in Data Studio after a refresh (click the refresh button in Data Studio). To automatically refresh the front-end visualization you can use the following plugin [2], or automate the click on the refresh button through browser console commands.
[1] https://cloud.google.com/bigquery/docs/scheduling-queries
[2] https://chrome.google.com/webstore/detail/data-studio-auto-refresh/inkgahcdacjcejipadnndepfllmbgoag?hl=en