Use multiple Google BigQuery service accounts in Power BI (Service & Desktop)

I'm trying to configure multiple datasets in Power BI Service to use different service accounts for different GCP projects.
Every time I publish a new dataset using a different Google service account, it is always associated with the previously registered BigQuery data source, overriding its credentials.
Is this use case supported?

I have the same issue. Microsoft says it is a product limitation and suggested using a Power BI data gateway.
It works fine with the data gateway.
I understand a data gateway is not normally required since BigQuery is a cloud source, but we accepted this as an interim setup.

For each dashboard you have to set the "Privacy level setting for this data source" to "Private".

I have the same issue.
I tried to apply your solution, but unfortunately it is not working: when I change project A's credentials, it changes project B's credentials as well.
Additionally, I am wondering how this would work if we were trying to use two projects in the same dataset.
Best regards,
Olivier LN (LinkedIn)

Related

Power BI Embedded Approach for 100s of SQL Targets

I'm trying to find the best approach to delivering a BI solution to 400+ customers, each of which has its own database.
I've got Power BI Embedded working using service principal licensing, and I have the Power BI service connected to my data through the on-premises data gateway.
I've built my first report pointing to one of the customer databases, which works nicely.
What I want to do next, when embedding the report, is to tell Power BI, for this session, to get the data from a different database.
I'm struggling to find anywhere this is explained, or to understand whether this is even possible.
I'm trying to avoid creating 400+ workspaces or 400+ datasets.
If someone could point me in the right direction, it would be appreciated.
You can configure the report to use parameters and these parameters can be used to configure the source for your dataset:
https://www.phdata.io/blog/how-to-parameterize-data-sources-power-bi/
These parameters can be set by the app hosting the embedded report:
https://learn.microsoft.com/en-us/rest/api/power-bi/datasets/update-parameters-in-group
Because the app is setting the parameter, each user will only see their own data. Since this will be a live connection, you would need to think about how the underlying server can support the workload.
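As a rough sketch of what the parameterized source could look like (the server name and parameter name below are placeholders, not anything from your setup), the dataset's M query can reference a text parameter that the embedding app overwrites per customer before triggering a refresh:

    let
        // CustomerDatabase is a Power Query text parameter defined in the report.
        // The embedding application sets it per customer through the
        // "Datasets - Update Parameters In Group" REST endpoint linked above,
        // then refreshes the dataset so the new value takes effect.
        Source = Sql.Database("your-sql-server.database.windows.net", CustomerDatabase)
    in
        Source

The same pattern works for the server name too, if customers are spread across multiple SQL instances.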
An alternative solution would be to consolidate the customer databases into a single database (just the relevant tables) and use row-level security to restrict access for each customer. The advantage of this design is that you take the burden off the underlying SQL instance and push it onto a Power BI dataset that is made to handle huge datasets with sub-second response times.
More on that here: https://learn.microsoft.com/en-us/power-bi/enterprise/service-admin-rls

Power BI :: Some credentials aren't validated as they are set to skip test connection

My problem is simple: I'm retrieving all lithium stocks on the ASX market from this web page.
I created a Power BI dashboard that you can view here.
Since the source is Web, I would like to refresh it every day using the gateway connection, so I have installed the gateway on my computer, which I turn on every day.
The problem is that instead of a successful daily refresh I have a daily error:
Failed to test the connection to your data source. Please retry your credentials.
Some credentials aren't validated as they are set to skip test connection.
If I go into Edit credentials, I see the error:
Failed to update data source credentials: The credentials provided for the Web source are invalid. (Source at https://www.marketindex.com.au/commodities/lithium.)
This is because I'm using Authentication method: Anonymous and Privacy level setting for this data source: Public. But this is exactly what I'm using in Power BI Desktop, and there it works!
What am I doing wrong here?
Why does the same connection work fine in Power BI Desktop but not in the same dashboard uploaded online?
EDIT: the on-premises data gateway appears to be installed without problems.
It seems like the issues you're experiencing may be caused by the link you're using (https://www.marketindex.com.au/commodities/lithium), although I'm not sure exactly why.
For example, as has been mentioned in the Do I Need a Gateway for Web post, one can use Web.Contents instead of Web.BrowserContents in the raw M code in the Power Query Editor to avoid using a gateway. However, when I test this implementation using the link that you provide, I get an error when refreshing from Power BI online.
If I test this implementation using an HTML table from the https://tradingeconomics.com/commodity/lithium website, though, and set the data source credentials appropriately, then I am able to successfully refresh the dataset from Power BI online.
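For reference, the substitution looks roughly like the sketch below in the Power Query Editor; the column names and CSS selectors are placeholders and would need to be adapted to the actual table on the page you scrape:

    let
        // Web.Contents fetches the raw page instead of the browser-based
        // Web.BrowserContents connector, which is what pulls in the gateway.
        Source = Text.FromBinary(Web.Contents("https://tradingeconomics.com/commodity/lithium")),
        // Placeholder selectors -- adjust them to match the table you want.
        LithiumTable = Html.Table(
            Source,
            {
                {"Name",  "TD:nth-child(1)"},
                {"Price", "TD:nth-child(2)"}
            },
            [RowSelector = "TABLE > * > TR"]
        )
    in
        LithiumTable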
I tested this with another HTML table from a random webpage and an online refresh worked as well. So it seems like there is something odd about how Power BI online interacts with the Market Index website and this is causing your issue. Potentially because of some authentication configuration on the Market Index side of things? I'm not able to help with looking further into why this is occurring, though.
Would you be able to use a different website with similar information as a data source for your report? If so, you could avoid the issues you're experiencing and avoid using a gateway in general if you wanted.
If you go this route, it seems like using the included "Scheduled refresh" setup in Power BI online works, but I will include the Power Automate scheduled refresh setup below in case this is desired instead.
Power Automate Scheduled Refresh
One can use Power Automate to "schedule" dataset refreshes instead of the included "Scheduled refresh" functionality in Power BI online. The steps are a little too long to include here, but the Refresh your Power BI dataset using Microsoft Flow article outlines the process on setting this up. There are various triggers that you can use to trigger your flow. The linked article uses one that triggers based on a SharePoint action, but there is a Recurrence trigger that can be used to trigger your flow at regular intervals/times.
Just set up your flow, make sure that it is "turned on", and your dataset will refresh at regular intervals from the cloud.

Can we use a button to change the data source in Power BI?

I am new to Power BI. I am using a SQL connection to load data into Power BI.
I created the report in the dev environment, but I want to use the same report in all environments (dev/test/uat/prod).
Question: Is it possible to switch the connection via a button click in the dashboard?
You'll have to use a parameter to select the connection and store the report in template format (*.pbit). Then you can easily create different versions of the report from the template by specifying the corresponding parameter value.
The only way to use a slicer for changing the environment would be to load the data from all different environments into the model first - which is clearly not recommended.
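As a minimal sketch of that idea (the server and database names below are made up for illustration), the template's M source could switch on an Environment text parameter:

    let
        // Environment is a text parameter ("dev", "test", "uat" or "prod")
        // that Power BI prompts for when a report is created from the .pbit.
        ServerName =
            if Environment = "prod" then "sql-prod.contoso.com"
            else if Environment = "uat" then "sql-uat.contoso.com"
            else if Environment = "test" then "sql-test.contoso.com"
            else "sql-dev.contoso.com",
        // Placeholder database name -- replace with your own.
        Source = Sql.Database(ServerName, "ReportingDb")
    in
        Source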
Power BI offers Deployment Pipelines for this purpose. This tool allows you to create three workspaces for the dev, test, and production stages. You can then deploy from one stage to another by clicking a button in Power BI Service or by using the REST API. In the pipeline you can define rules for data sources and parameters, which can be used to automatically change the data source when deploying to the next stage, e.g. to change it from your dev database to the test database, or from the test database to the production one.
You can also implement similar functionality using the API. See for example this answer.
It's a tricky question.
Try the above answers first; if those don't work, try this approach.
I don't think they have implemented a solution for this yet.
In my experience, I had to create three dashboards and gateways, one each for the dev, test, and prod dashboards.
If your dev, test, and prod database column names are the same, you can just copy-paste your dashboard and rename it accordingly.
Then go to change the data source, add the new test environment host, and change the schema to the test environment.
If you get a few errors, you have to resolve them: check the column names and the host, and finally sync your data.
You can use the same approach for the prod environment.
Once you publish, you can point to the gateway for the dev, test, or prod environment.
Note: Set up the gateway on your server.

Power BI unable to set refresh schedule for dataset

We are using Power BI online. We purchased a Pro license.
I checked my login and I am in the Global Administrator role.
Still, I found the Refresh schedule option on the dataset disabled for me. Why?
It is disabled because your dataset is not refreshable. You should check the "What's supported?" section of the Configuring scheduled refresh article. You didn't mention what the source of your dataset is. In general, the Power BI service needs access to the data source to be able to refresh the dataset. This means that either your data source should be somewhere in the cloud (e.g. Azure SQL Database, a file on OneDrive or SharePoint Online, etc.), or there must be a data gateway installed and configured.

Power BI Gateway on-prem server

I have an MSSQL server on-prem and followed the steps from Microsoft's docs in order to install the gateway.
Then I managed the gateway via Manage gateways within Power BI (cloud); the test there worked and I got the green check.
Now the question is: how do I get the data from the gateway into a dataset? When I try to add a new dataset there and click on Databases, I only get Azure SQL Database, Azure SQL Data Warehouse, SQL Server Analysis Services, and Spark on Azure HDInsight, but no sign of my fresh data gateway to my on-prem DB.
Currently you can only access on-premises SQL Server databases by building a model and publishing it to powerbi.com using Power BI Desktop.
If you select "Import" as your connection option, Power BI will store the required data online. You will be able to query on that fixed data and work with it even when your connection to the on premises server is down. You can update your fixed dataset with new data if your connection is properly defined in your gateway.
If you select "DirectQuery" in your Power BI Desktop you will require an active connection to the On premises server for any reporting to work online. On the up side, it'll work on live data and will not store any data online.
When publishing, Power BI Desktop will try to match the connections used in your report to the ones defined in your gateways. If it can find a match, it'll automatically link those up, and it'll give you an informational message after publishing to let you know whether this step succeeded. At the same time, it'll add a dataset using the same data you used in the reports you just built.