Power BI Desktop with Azure Data Lake Store?

I am connecting Power BI to an Azure Data Lake Store that contains multiple files representing multiple tables.
1) It is currently loading the data into the Power BI file (Import mode). Can I instead have a live connection from Power BI Desktop to Azure Data Lake Store?
2) Can I load multiple files to represent dimension and fact tables?

From the desktop you can access the Azure Data Lake Store data source; just make sure you're using a recent version of Power BI Desktop (see the "Data Lake Store - Power BI" documentation).
You can join multiple queries together in Power BI Desktop, so separate files can serve as dimension and fact tables.

The Power BI documentation does not list Azure Data Lake Store as a source that can be connected to live as of May 2018: https://learn.microsoft.com/en-us/power-bi/refresh-data#live-connections-and-directquery-to-on-premises-data-sources
Alternatively, you could use Azure Stream Analytics to create a job that copies the data and connect to the live stream, but that process might need to be triggered manually and requires data movement, which might not be ideal for your scenario: https://learn.microsoft.com/en-us/azure/data-lake-store/data-lake-store-stream-analytics
Hope this helps.

Yes, you can get data from Azure Data Lake Store in the Power BI Desktop application.
Also, you can join multiple queries from different tables in the Power BI Desktop app.

Related

How to connect to a Power BI dataset in Azure Data Factory

We have a requirement to fetch data from a Power BI dataset, transform it, and then publish the data back to a Power BI workspace so that it can be consumed in Power BI.
-> Is there any way to connect to my Power BI dataset in ADF?
-> Once the data is fetched, can I use Power Query to work on the dataset in ADF?
-> Where should that data be stored: in ADLS, in Azure Blob, or can I save it directly to SQL?
-> Once I have transformed the data, can I publish my gold data to a Power BI workspace?
There is a Power BI REST API to execute queries against a dataset: Datasets – Execute Queries.
You can access the Power BI dataset via managed identity or service principal authentication. The blog below walks through copying data from Power BI through an Azure Data Factory/Synapse pipeline; a sketch of the underlying call is shown after the link.
https://datasharkx.wordpress.com/2022/11/03/copy-data-from-power-bi-through-azure-data-factory-synapse-pipeline-via-managed-identity-authentication-part-4/
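For reference, here is a minimal sketch in Python (assuming the msal and requests packages) of what such a pipeline ultimately does: acquire a service principal token and POST a DAX query to the executeQueries endpoint. The IDs, the secret, and the 'Sales' table are placeholders.

```python
# Minimal sketch: run a DAX query against a dataset via the
# "Datasets - Execute Queries" REST API using a service principal.
# Tenant/client IDs, secret, dataset ID, and the DAX query are placeholders.
import msal
import requests

TENANT_ID = "<tenant-id>"
CLIENT_ID = "<app-client-id>"
CLIENT_SECRET = "<app-secret>"
DATASET_ID = "<dataset-id>"

# Acquire an app-only token for the Power BI REST API.
app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)
token = app.acquire_token_for_client(
    scopes=["https://analysis.windows.net/powerbi/api/.default"]
)["access_token"]

# Execute the query and print the returned rows.
resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/datasets/{DATASET_ID}/executeQueries",
    headers={"Authorization": f"Bearer {token}"},
    json={"queries": [{"query": "EVALUATE TOPN(10, 'Sales')"}]},
)
resp.raise_for_status()
print(resp.json()["results"][0]["tables"][0]["rows"])
```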
You can leverage Dataflows to do the transformation.
As for your third question, the answer really depends on your architecture; you can persist the data anywhere (ADLS, Blob Storage, or SQL).

How to connect streaming data and display in Power BI

I am working on a Power BI report that is embedded in a portal. I need to display live GPS locations of vehicles on a map, filtered to the company the user has access to in the portal. Basically, every user will have a company associated with them, and I'll use row-level security for this.
A third-party application will give access to live streaming data through an API. Is it possible to connect to a third-party application's API from Power BI and show GPS locations? If it is possible, how can I do it? I am totally new to this; please help me with any documents/scripts. We have a Power BI Pro license.
For your use case you will not be able to do it all in Power BI. You'll need to push the streaming data to some sort of database (Azure SQL Database, Azure Cosmos DB, or Azure Databricks) using Event Hubs, and possibly Stream Analytics as well. So, for example, your stream sends the data to the Event Hub endpoint, that data is written to a database, and Power BI reads from the database; a sketch of the producer side is shown below.
You can then link a Power BI report to that database, set it up in DirectQuery mode, and configure page refresh on the report. Depending on the type of service, the refresh period for non-Embedded/Premium workspaces will be 30 minutes or more, so not 'real time'.
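As an illustration of the ingestion side, here is a minimal Python sketch assuming the azure-eventhub package; the connection string, hub name, and the shape of the GPS readings are placeholders for whatever the third-party API actually returns.

```python
# Minimal sketch: forward GPS readings from a third-party API into Event Hubs,
# from where Stream Analytics (or another consumer) can land them in a database
# that Power BI reads in DirectQuery mode. All names here are placeholders.
import json
from azure.eventhub import EventHubProducerClient, EventData

producer = EventHubProducerClient.from_connection_string(
    conn_str="Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=<name>;SharedAccessKey=<key>",
    eventhub_name="vehicle-gps",
)

def forward_positions(positions):
    """Send a batch of readings, e.g. {"vehicleId": "V42", "lat": 51.5, "lon": -0.12}."""
    batch = producer.create_batch()
    for reading in positions:
        batch.add(EventData(json.dumps(reading)))
    producer.send_batch(batch)
```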

How to deploy Power BI reports and connect them to a single Power BI Dataset

As far as I know, deploying a Power BI report from Power BI Desktop results in two items: the report itself and the dataset. Deploying a new report that uses the same dataset will deploy the new report plus a second copy of the same dataset to the Power BI service. That is not what I want. To avoid confusing end users and others, I want only a single dataset deployed.
I want to use Azure DevOps to deploy to the Power BI service in a Dev, Test, and Prod fashion. The dataset will be an Azure Analysis Services data model, but the principle should be the same: I need exactly one dataset, and all reports must connect to that data model. I have heard of a REST API or PowerShell scripting that can come to the rescue here.
So if any of you have done this, or know of a good article that describes how to do it, I would be grateful.
Regards, Geir
The best option is to separate the Power BI report into a frontend and a backend. You create a file purely for the dataset (if you are importing data), with no reports built on it. You can then create the reports either in the service, connecting to the published dataset, or in Power BI Desktop via the 'Power BI datasets' connector. Both use 'Live Connection' mode, so you cannot add any other data sources to the model, for example a CSV file or a SQL database.
If you are connecting to an Azure Analysis Services data model, you can use the same approach; however, since such a dataset is only a connection, not a full copy of the data, having copies of it is not really an issue. Duplicated datasets only matter when you are importing data; in that case it is best to move shared logic into dataflows, use the same frontend/backend split, and plan the scheduling of the dataflows and then the datasets.
You can use the REST API to move reports and the datasets they connect to into new workspaces, and to rebind a report to an existing dataset; a sketch is shown below. If you have Power BI Premium, it includes a lifecycle tool (deployment pipelines) to move items between dev/test/prod workspaces.
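For example, a minimal Python sketch (assuming the requests package, an already-acquired access token, and placeholder IDs) of rebinding a report to an existing shared dataset with the 'Reports - Rebind Report In Group' API might look like this:

```python
# Minimal sketch: rebind a report to an existing shared dataset via the
# Power BI REST API. IDs and the token are placeholders.
import requests

GROUP_ID = "<workspace-id>"
REPORT_ID = "<report-id>"
TARGET_DATASET_ID = "<shared-dataset-id>"
token = "<access-token>"  # acquired via MSAL or similar

resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}/reports/{REPORT_ID}/Rebind",
    headers={"Authorization": f"Bearer {token}"},
    json={"datasetId": TARGET_DATASET_ID},
)
resp.raise_for_status()
```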
If you create a report in Desktop and choose a 'Power BI dataset' live connection to build on, then when you upload the report to the same workspace it will only upload the report and connect it to the existing dataset.
https://radacad.com/power-bi-shared-datasets-what-is-it-how-does-it-work-and-why-should-you-care

Power BI + Snowflake SSO

Is it possible to use a different warehouse in Snowflake for data refresh in Power BI than the warehouse used for dashboard development?
I'd like to have a separate warehouse for data refresh in order to be able to track Snowflake expenses.
Let's say I'm using SF_TUTS_WH as the warehouse for development, but I'd like to use PBI_WH as the warehouse for data refresh in Power BI. Is it possible to achieve this, or do I have to change the Snowflake warehouse in the Power BI file before I upload it to Power BI online?
FYI: https://www.snowflake.com/blog/using-sso-between-power-bi-and-snowflake/
One option that will work for you is to parameterize the warehouse name in the reports. The value of the parameter can then be updated from the service as required. Please check out the video link below to see how parameters can be updated in the Power BI service; a REST-based sketch follows.
https://youtu.be/MJVSu47iMmo
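If you want to automate the switch rather than edit the value by hand, a minimal Python sketch of the 'Datasets - Update Parameters In Group' REST call might look like the following; the IDs, the token, and the parameter name "Warehouse" are placeholders for whatever you defined in the report.

```python
# Minimal sketch: change a dataset parameter (e.g. the Snowflake warehouse)
# in the Power BI service, then trigger a refresh that uses the new value.
# IDs, token, and the parameter name "Warehouse" are placeholders.
import requests

GROUP_ID = "<workspace-id>"
DATASET_ID = "<dataset-id>"
token = "<access-token>"  # acquired via MSAL or similar
headers = {"Authorization": f"Bearer {token}"}
base = f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}/datasets/{DATASET_ID}"

# Point the dataset at the refresh warehouse...
requests.post(
    f"{base}/Default.UpdateParameters",
    headers=headers,
    json={"updateDetails": [{"name": "Warehouse", "newValue": "PBI_WH"}]},
).raise_for_status()

# ...then refresh the dataset so the change takes effect.
requests.post(f"{base}/refreshes", headers=headers).raise_for_status()
```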
Hope this helps
Thanks,
Prasanth

Combine several pbix models into a single repository

I have a collection of .pbix models that follow a similar structure, i.e., they have the same tables and relationships.
It is too complex to combine them all into a single .pbix.
Is there a way to upload all these tables into a single repository, like Power BI Service dataflows, a data warehouse, or something similar?
And then get the data back into Power BI Desktop to perform DAX calculations, visualizations, and reporting?
Any suggestions/ideas?
Thank you so much for helping!
You can publish them to the Power BI service and then create separate reports that use these published datasets as a data source.
See Connect to datasets in the Power BI service from Power BI Desktop.
After publishing your "model" reports to Power BI online, start a new blank report, but instead of getting the data from files/databases/etc., choose the Power BI service as the data source and select the previously published dataset. You can then publish your report the same way; in this case, one dataset (your model) is shared between multiple reports.