Connect Power BI to BigQuery by JSON key - google-cloud-platform

I have a Google Cloud JSON key (eg. z-dexxent-3X5X02-2X5X287f780X.json) and I'd like to connect Power BI to some specific project/dataset/table in BigQuery using this JSON file.
Please, is there a step-by-step guide for this?
Thanks in advance!

I might be missing something, but if your data is already in BigQuery, I would use the built-in Power BI Google BigQuery connector:
https://learn.microsoft.com/en-us/power-bi/connect-data/desktop-connect-bigquery
If it is not yet in BigQuery, follow these steps:
https://cloud.google.com/bigquery/docs/loading-data-cloud-storage-json
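For reference, a minimal M sketch of roughly what the connector generates (the project, dataset and table names are placeholders, and the exact navigation steps the Navigator produces may differ). The service-account JSON key itself is not embedded in the query; you supply it under the Service Account Login option when Power BI prompts for credentials:

let
    // Google BigQuery connector; the JSON key is entered in the credential
    // dialog (Service Account Login), not in this query.
    Source = GoogleBigQuery.Database([BillingProject = "my-gcp-project"]),
    // Placeholder project/dataset/table names - replace with your own.
    Project = Source{[Name = "my-gcp-project"]}[Data],
    Dataset = Project{[Name = "my_dataset", Kind = "Schema"]}[Data],
    MyTable = Dataset{[Name = "my_table", Kind = "Table"]}[Data]
in
    MyTable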

Related

How to refresh a dataset in Power BI that was imported via a Power Query (M query)

I am trying to visualize a logs dataset in Power BI that I got from Azure Logs with the help of the Kusto Query Language (KQL). I exported the logs output as an M query and, with the help of that M query, imported the dataset into Power BI.
Now I want to refresh my dataset at an interval of 2 hours. How can I do it without manual intervention? Please help me. Thank you.
You simply configure Scheduled Refresh in the Power BI service.
You can also get the logs in near real time by using Azure Data Explorer to query Log Analytics as a DirectQuery source.
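As a rough sketch of that second option (the ade.loganalytics.io proxy URL, workspace name and KQL below are all placeholders, not taken from the question), the Azure Data Explorer connector can run the KQL directly and also supports DirectQuery:

let
    // Query a Log Analytics workspace through the Azure Data Explorer connector;
    // URL, workspace and KQL are placeholders - substitute your own.
    Source = AzureDataExplorer.Contents(
        "https://ade.loganalytics.io/subscriptions/<subscription-id>/resourcegroups/<resource-group>/providers/microsoft.operationalinsights/workspaces/<workspace>",
        "<workspace>",
        "AppRequests | where TimeGenerated > ago(2h)"
    )
in
    Source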

Importing notebook from Databricks into PowerBI

Is there an easy way to import the results of SQL written in a Databricks notebook directly into Power BI?
You can use the Partner Connect feature from Databricks. In a few clicks, it will help you connect Delta tables to Power BI.
If that doesn't help, export the data as CSV and use that as the source for Power BI.
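If you prefer to set up the connection yourself rather than through Partner Connect, here is a hedged M sketch using the Databricks connector (the host, HTTP path and table names are placeholders; a running cluster or SQL warehouse is required):

let
    // Databricks connector; host and HTTP path come from the cluster's
    // JDBC/ODBC settings or the SQL warehouse connection details (placeholders here).
    Source = Databricks.Catalogs(
        "adb-1234567890123456.7.azuredatabricks.net",
        "/sql/1.0/warehouses/abcdef1234567890"
    )
    // From here, pick the table the notebook's SQL wrote to in the Navigator,
    // or add a navigation step such as Source{[Name = "hive_metastore"]}[Data].
in
    Source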

How to make sure that my source tables from Google Cloud, which are used in Power BI, are being refreshed

I have to create a Power BI dashboard in which the data will come from Google Cloud. Of the two data sources, I want one of my tables to be refreshed every Tuesday, meaning the table created in Google Cloud should be refreshed weekly so that the refresh is reflected in Power BI. I am new to both Google Cloud and Power BI. Can someone help me achieve this requirement?
Thanks in advance.

PowerBI - support for Parquet format from ADLS Gen1

I need to know whether Power BI supports the Parquet format as a source from ADLS Gen1.
I am planning to use ADLS Gen1 or Databricks Delta Lake (which supports the Parquet format only) as the source to get data into Power BI. Kindly suggest, or please share any documentation link to get an idea about it.
Power BI does support both Gen 1 and Gen 2 Data Lake; however, both Power BI and Azure Analysis Services do NOT support Parquet file formats. You will have to convert them to a text format, like CSV or another delimited format, to load them from the Data Lake.
If you are using Databricks and have created a table structure based on Delta tables, you can connect to those tables from Power BI using the Databricks connector. However, you'll only be able to access the tables when the cluster is running.
Some outlines are on the MS Docs site: https://learn.microsoft.com/en-us/power-query/connectors/datalakestorage
You can use the Azure Data Lake Storage connector in Power BI under Get Data.
For more details you can refer here.
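As a hedged illustration of the CSV route described above (the account URL and file name are placeholders I made up, not from the question), an M query reading a delimited file from ADLS Gen1 could look roughly like this:

let
    // Read a delimited file from ADLS Gen1 via the Azure Data Lake Storage connector;
    // the account URL and file name are placeholders.
    Files = DataLake.Contents("adl://myaccount.azuredatalakestore.net/curated"),
    // Per the advice above, the Parquet output would first be converted to CSV.
    FileContent = Files{[Name = "sales.csv"]}[Content],
    ParsedCsv = Csv.Document(FileContent, [Delimiter = ",", Encoding = 65001]),
    Promoted = Table.PromoteHeaders(ParsedCsv, [PromoteAllScalars = true])
in
    Promoted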

Power BI + Snowflake SSO

Snowflake
PowerBI
Is it possible to use a different warehouse in Snowflake for data refresh in Power BI than the warehouse used for dashboard development?
I'd like to have a separate warehouse for data refresh in order to be able to track Snowflake expenses.
Let's say I'm using SF_TUTS_WH as the warehouse for development, but I'd like to use PBI_WH as the warehouse for data refresh in Power BI. Is it possible to achieve this, or do I have to change the Snowflake warehouse in the Power BI file before I load it to Power BI online?
FYI: https://www.snowflake.com/blog/using-sso-between-power-bi-and-snowflake/
One option which will work for you is to have the warehouse name parameterized for the reports. The value of the parameter can be updated from the service based on the requirements. Please check out the video link below to see how the parameters can be updated in the Power BI service.
https://youtu.be/MJVSu47iMmo
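For illustration, a hedged sketch of the parameterized source query (the account URL and object names are placeholders; WarehouseName is a Power Query parameter you create via Manage Parameters, whose value can later be changed in the dataset settings in the Power BI service, e.g. from SF_TUTS_WH to PBI_WH after publishing):

let
    // WarehouseName is a Power BI parameter (Manage Parameters in Power Query),
    // e.g. "SF_TUTS_WH" while developing and "PBI_WH" after publishing.
    Source = Snowflake.Databases(
        "myaccount.snowflakecomputing.com",  // placeholder account URL
        WarehouseName
    ),
    // Placeholder database/schema/table navigation - replace with your own.
    Database = Source{[Name = "MY_DB", Kind = "Database"]}[Data],
    Schema = Database{[Name = "PUBLIC", Kind = "Schema"]}[Data],
    Result = Schema{[Name = "MY_TABLE", Kind = "Table"]}[Data]
in
    Result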
Hope this helps.
Thanks,
Prasanth