How to use a real database in Power BI and make it refresh dynamically?

I'm new to Power BI (Free version) and I have been asked to develop a reporting system that generates reports from an Excel sheet. The reports work well, but only for the data I have already collected.
My question is: how do I connect to the data directly from SQL Server, without having to export it to Excel and then import it into Power BI? I also want the data to be refreshed dynamically.
One of the solutions I tried was to add a new dataset, but I get the following message:
Refresh can't be scheduled because the data set doesn't contain any
data model connections, or is a worksheet or linked table. To schedule
refresh, the data must be loaded into the data model.
I have looked at many solutions but none of them has worked.
Am I missing a concept? Thank you.

If this data is stored in a SQL Server table, it is a pretty straightforward process.
When you create a new Power BI report (.pbix) you should see a prompt asking whether you want to "Get Data". Select the 'SQL Server Database' option.
Then you will be asked to enter the server and database name, and to specify either 'Import' or 'DirectQuery' mode. If you choose 'Import', a copy of the data is loaded into the data model and is updated whenever the dataset is refreshed (manually or on a schedule). If you choose 'DirectQuery', the connection stays live, i.e. any changes to the data in your database will be reflected in the report.
Once you get past this window, you will be asked to either specify credentials or use Windows authentication to access the server and database. After that, you can either write a query to pull in the data or select from a list of tables.
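For reference, the query that the Get Data dialog generates looks roughly like the following in the Power Query (M) editor. This is only a minimal sketch; the server, database, schema and table names are placeholders you would replace with your own:

let
    // Connect to the SQL Server instance and database (placeholder names)
    Source = Sql.Database("MyServer", "MyDatabase"),
    // Pick one table from the navigation list, e.g. dbo.Sales (hypothetical table name)
    Sales = Source{[Schema = "dbo", Item = "Sales"]}[Data]
in
    Sales

Once published to the Power BI Service, an Import-mode dataset like this can be put on a scheduled refresh (through a gateway if the server is on-premises), which addresses the "data must be loaded into the data model" message above.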
I hope this helps!!

Related

Optimize data load from Azure Cosmos DB to Power BI

Currently we have a problem loading data when refreshing the report from the database: it has too many records and it takes forever to load everything. How can I load only the data from the last year, to avoid it taking so long? I can see that the Cosmos DB connector dialog lets me enter an SQL query, but I don't know how to write one for this type of non-relational database.
Power BI has an incremental refresh feature. You should be able to refresh the current year only.
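Incremental refresh works by defining two reserved datetime parameters, RangeStart and RangeEnd, and filtering the source query on them; Power BI then only reloads the partitions that fall inside the refresh window. Below is a minimal, illustrative M sketch only: the account URL, database and container names are placeholders, and it assumes the connector exposes the documents in a "Document" record column and that you filter on the numeric _ts field:

let
    // Connect to the Cosmos DB account (placeholder URL, database and container)
    Source = DocumentDB.Contents("https://myaccount.documents.azure.com:443/", "MyDatabase", "MyContainer"),
    // Expand the fields you need from the document records (hypothetical field names)
    Docs = Table.ExpandRecordColumn(Source, "Document", {"id", "_ts", "amount"}),
    // Helper to convert the RangeStart/RangeEnd datetime parameters to Unix epoch seconds,
    // since _ts stores seconds since 1970-01-01 (UTC)
    ToEpochSeconds = (dt as datetime) => Duration.TotalSeconds(dt - #datetime(1970, 1, 1, 0, 0, 0)),
    // RangeStart and RangeEnd are the reserved parameters the incremental refresh policy fills in
    Filtered = Table.SelectRows(Docs, each [_ts] >= ToEpochSeconds(RangeStart) and [_ts] < ToEpochSeconds(RangeEnd))
in
    Filtered

Note that if the filter does not fold back to Cosmos DB, the documents are still retrieved before being filtered, so for very large containers pushing the filter into the connector's Query option (shown further down) may work better.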
If that still doesn’t meet expectations, I would look at a preview feature called Azure Synapse Link, which automatically pulls all Cosmos DB updates into analytical storage that you can query much faster from Azure Synapse Analytics, in order to refresh Power BI faster.
Depending on the volume of the data, you will hit a number of issues. The first is that you may exceed your RU limit, slowing down the extraction of the data from Cosmos DB. The second is transforming the data from JSON into a structured format.
I would try to write a query that specifies only the fields and items you need. That will reduce the time spent processing and retrieving the data.
For SQL queries it will be something like:
SELECT * FROM c WHERE c.partitionEntity = 'guid'
For more information, see the Cosmos DB SQL API syntax documentation to get you started.
You can use the query window in Azure to run the SQL commands, or Azure Storage Explorer to test the query, then move it to Power BI.
It is highly recommended to extract the data into a place where it can be transformed into a structured format, like a table or CSV file.
For example, use Azure Databricks to extract the data, then turn the JSON into a table-formatted object.
You have the option of running Databricks notebook queries against Cosmos DB, or running Azure Databricks in its own instance. Another option would be to use the change feed and an Azure Function to send and shred the data into Blob Storage, then query it from there using Power BI, Databricks, Azure SQL Database, etc.
In the Source step of your query, you can make a selection based on the Cosmos DB _ts system property, like:
Query ="SELECT * FROM XYZ AS t WHERE t._ts > 1609455599"
In this case, 1609455599 is the Unix timestamp corresponding to 31.12.2020, 23:59:59, so only data from 2021 onwards will be selected.
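For reference, this is roughly what the full Source step looks like when such a statement is passed through the Cosmos DB connector's SQL option; the account URL, database and container names below are placeholders:

let
    // Pass the SQL statement to the connector so the filter runs on the Cosmos DB side
    Source = DocumentDB.Contents(
        "https://myaccount.documents.azure.com:443/",
        "MyDatabase",
        "XYZ",
        [Query = "SELECT * FROM XYZ AS t WHERE t._ts > 1609455599"]
    )
in
    Source

With the filter applied at the source like this, only the matching documents are returned to Power BI, instead of being filtered after a full load.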

PowerBI Dataset and Report replacement

So, I created a dataset that can be used to generate multiple reports and dashboards.
I would like to publish it to the web portal and allow report designers to create their own reports and dashboards. This works great until I need to update the model. If I update the model and push the dataset via the Publish function, it deletes the reports.
So is the only option for everyone to use Power BI Desktop? What is the web-based report designer for? Is it only for datasets that will never change?
If your Power BI file has reports in it, publishing will overwrite them. What I think is happening is that your users are updating the report attached to your dataset, for example adding a new tab, so when you publish your changes it overwrites theirs.
You need to decouple the dataset from the reports. You can publish your dataset without any reports, and then build reports using two methods.
1) Connect to the dataset from Power BI Desktop, using the 'Power BI datasets' connection. That links to the dataset via a Live Connection, so you can build your reports from it. You can then publish your report, and any changes to the dataset will not overwrite it.
2) For the web builder you can do the same: create a new report in the portal and select a dataset from that workspace to build your report on. The user then needs to use 'Save as' and give the report a new name.
Both methods just link to your dataset, rather than building a report directly in the dataset file.

Reuse a previously published datasource in a Power BI report

I have developed a Power BI report using Power BI Desktop, pointing to a private on-premises development database as the data source so that I could develop and test it easily. Then I published it from my Power BI Desktop .pbix to my customer's workspace.
As a result, the workspace contains the published report and the dataset. Later, my customer changed the dataset so that it now points to their own on-premises production database. It works perfectly.
Now I want to publish a new report for my customer using the previously published and reconfigured dataset. The problem is that I can't see any option in Power BI Desktop to point the report to the published dataset, nor any option to avoid creating a new dataset each time I publish a report, nor any way to reconfigure the newly published report from the web portal so that it points to the same dataset as the first one.
Is there any way to do this, or any workaround for this scenario? I think the most reasonable solution would be to be able to change the dataset of any report, so that datasets would be interchangeable between reports.
Update:
I had already used connection-specific parameters, but I am not given rights to change the published dataset, so that's a dead end.
Another thing I have found is that in Power BI Desktop you cannot change the connection parameter values to those of the production environment and publish the report if you cannot reach the target database from your computer: Power BI Desktop asks you to apply the changes first, and when it tries to apply the new values it tries to connect to the corresponding database and, obviously, fails with a network or timeout error, cancelling the changes and returning you to the starting point.
It's always a good practice to use connection-specific parameters to define the data source. This means that you do not enter the server name directly, but specify it indirectly using a parameter. The same goes for the database name, if applicable.
If you are about to make a new report, cancel the Get Data dialog, define the parameters as described below, and then in Get Data specify the data source using these parameters.
To modify an existing report, open Power Query Editor by clicking Edit Queries and in Manage Parameters define two new text parameters, let's name them ServerName and DatabaseName.
Set their current values to point to one of your data sources, e.g. SQLSERVER2016 and AdventureWorks2016. Then right-click your query in the report and open Advanced Editor. Find the server name and database name in the M code and replace them with the parameters defined above.
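For illustration, the relevant line of M code changes roughly as follows, using the example server and database names from above:

// Before: the server and database names are hard-coded literals
Source = Sql.Database("SQLSERVER2016", "AdventureWorks2016")

// After: the literals are replaced by the ServerName and DatabaseName parameters
Source = Sql.Database(ServerName, DatabaseName)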
Now you can close and apply changes and your report should work as before. But now when you want to change the data source, do it using Edit Parameters:
and change the server and/or database name to point to the other data source that you want to use for your report.
After changing the parameter values, Power BI Desktop will ask you to apply the changes and reload the data from the new data source. To change the parameter values (i.e. the data source) of a report published in the Power BI Service, go to the dataset's settings and enter the new server and/or database name:
If the server is on-premise, check the Gateway connection too, to make sure that it is configured properly to use the right gateway. You may also want to check the available gateways in Manage gateways:
After changing the data source, refresh your dataset to get the data from the new source. With a Power BI Pro account you can do this 8 times per 24 hours, while if the dataset is in a dedicated capacity this limit is raised to 48 times per 24 hours.
This is an easy way to make your reports "switchable", e.g. to switch a report from a DEV or QA environment to PROD, or, as part of your disaster recovery plan, to automate switching all reports in a workspace to another DR server. In your case, it will allow you (or your customers) to easily switch the data source of the report.
I think the only correct answer is that it cannot be done, at least at this moment.
The closest way of achieving this is with Live connections:
https://learn.microsoft.com/en-us/power-bi/desktop-report-lifecycle-datasets
But if you have already designed your report without using a Live connection, against your own development environment and its connection parameters, then you are out of luck: your only options are to redo the whole report using a Live connection, or, as the oddest workaround, to use an alias in your configuration that matches the server name and database name of the target production environment.

PowerBI - Automatic Refresh OneDrive Business From Excel File

I have an Excel sheet stored in a OneDrive for Business folder, which is updated continuously (approximately every minute). I am trying to show a live count of the number of entries in the table, as below, on a Power BI report:
From here I have tried two options:
1. Created a Power BI Desktop file which shows the total count on a single card, and then published it to the Power BI Service as a report.
2. Imported the Excel file via "Get Data -> Files -> OneDrive - Business" in the Power BI Service:
I loaded in the data and then created a report as below:
However, when a change is made to the Excel file on OneDrive, the report data does not update automatically. Instead, the only way it updates is via the "Refresh now" option on the dataset in the Power BI Service:
Then, once that has refreshed, I also have to manually refresh the data in the report window.
The connection between the report and the Excel file is therefore available to allow the manual update, but for some reason the report does not update automatically when I make a change. Are there any solutions available to get this to update automatically?
I saw this cool idea, but I can't seem to get it to work:
https://bigintsolutions.com/2019/03/29/refresh-power-bi-report-every-min-and-show-on-a-tv/
I have also read many threads, with some people having the same issue I am having:
https://community.powerbi.com/t5/Power-Query/Automatic-Refresh-not-working-when-connecting-to-SharePoint/td-p/546308
I know that there should be an update every hour for Power BI-OneDrive connections, but I was hoping there was a way to update live.
Goal: a Power BI Service report that updates automatically when a change is made, say every minute, for a live data feed to a TV screen.
Any help would be greatly appreciated!
Where are you sourcing your data? Writing updates to an Excel file every minute seems like something that won't scale in the long term. This sounds like a better scenario for a real-time dataset in hybrid mode. Then you could build a dashboard over the data and it would update automatically as the data updates. I've used Power Automate to push data into a real-time dataset as well.
Treb Gatte, Power BI MVP

How to access a single-tenant Azure Analysis Services server with Power BI Embedded

So, currently I'm having difficulty understanding how Power BI Embedded can be set up so that each customer accesses data from their own separate Azure Analysis Services instance; this is an App Owns Data scenario. Analysis Services will be running in in-memory mode and will be accessed from Power BI via Live Connect.
Ideally I would like the Power BI report to be ignorant of the dataset/data source until the embedded report is provided with a parameter (e.g. a connection string) which it interprets so that it knows which server to connect to. So, ideally: one Workspace, one Report, and zero (or a fake) Dataset.
The following is roughly what I'm looking to do (note that the red and blue flows access different servers):
It looks like if I created both a Report and a Dataset per customer I could achieve my goal, but this seems like a poor approach, since if the Report needs to be updated it involves updating potentially hundreds of reports. Creating hundreds of Reports also seems like unnecessary overhead when all Power BI needs to change for each request is the connection string pointing to the data source.
So is it possible to share the Workspace and Report across all customers while having completely separate data sources? Or is my approach in conflict with the way Power BI is expected to function?
To date, I've tried using Query Parameters when configuring the data source in Power BI Desktop but I get the following error:
The connect live option for this file is disabled because it already contains data from another data source. You cannot explore live data and connect to another type of data source in the same file.
Please note,
Every report in Power BI can be connected to only one Dataset.
There is NO ability to dynamically change a connection string on the fly.
Currently, and for the foreseeable future, you'd have to clone the report & dataset per customer (or per connection setup) and modify the new dataset's connection string to match.
You can then dynamically choose which report to display based on your customer's needs.
Cloning a report can be done using:
POST https://api.powerbi.com/v1.0/myorg/reports/{report_id}/Clone
POST https://api.powerbi.com/v1.0/myorg/groups/{group_id}/reports/{report_id}/Clone
https://msdn.microsoft.com/en-us/library/mt784674.aspx
Changing the connection string would be done using:
POST https://api.powerbi.com/v1.0/myorg/datasets/{dataset_id}/Default.SetAllConnections
(similar API for groups)
https://msdn.microsoft.com/en-us/library/mt748181.aspx
Using the C# .NET library provided by the Power BI team, you'd use:
Reports.CloneReport(string reportKey, CloneReportRequest requestParameters)
Datasets.SetAllDatasetConnections(string datasetKey, ConnectionDetails parameters)