Can SSO be used to create a dataflow that will reside in the PBI service? - powerbi

Our client would like us to use dataflows for data reuse and other reasons. We will be connecting to a Snowflake database from PBI service. However, they also want to be able to use SSO (Single Sign-On). So, when a user creates a dataset referencing a dataflow, they want the credentials from the currently logged in user to be picked up via SSO and passed along to Snowflake when the dataflow retrieves data from Snowflake. I don't think this can be done but I wanted to verify.
BTW, I know that SSO can be used with PBI Desktop. Just curious if dataflows can use it.

Yes, it appears to be possible to use dataflows with SSO for Snowflake. I am basing this conclusion on the following reference: https://learn.microsoft.com/en-us/power-query/connectors/snowflake
which lists Dataflows under Summary - Products.

Related

Google Merchant Center - retrieve the BestSellers_TopProducts_ report without the BigQuery Data Transfer service?

I'm trying to find a way to retrieve a specific Google Merchant Center report (BestSellers_TopProducts_) and upload it to BigQuery as part of a specific ETL process we're developing for a customer we have at my workplace.
So far, I know you can set up the BigQuery Data Transfer service to automate downloading this report, but I was wondering if I could accomplish the same with Python and some Google API libraries (like python-google-shopping). Then again, I may be overdoing it and setting up the service is the way to go.
Is there a way to accomplish this rather than resorting to the aforementioned service?
On the other hand, and assuming the BigQuery Data Transfer service is the way to go, I see (in the examples) that you need to create and provide the dataset you're going to extract the report data to, so I guess the extraction is limited to the GCP project you're working with.
I mean... you can't extract the report data for a third-party even if you had the proper service account credentials, right?
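For what it's worth, if the transfer service route is chosen, the table it populates can then be queried from Python with the google-cloud-bigquery client. A minimal sketch, assuming a transfer has already been configured; the project name, dataset name and merchant ID suffix below are placeholders:

    # Minimal sketch: query the table that the BigQuery Data Transfer service
    # populates from Google Merchant Center. Project, dataset and merchant ID
    # are placeholders; the real table name carries your own merchant ID suffix.
    from google.cloud import bigquery

    client = bigquery.Client(project="my-gcp-project")  # hypothetical project ID

    query = """
        SELECT *
        FROM `my-gcp-project.merchant_center.BestSellers_TopProducts_12345678`
        LIMIT 100
    """

    for row in client.query(query).result():
        print(dict(row))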

Power BI security

I am new to BI and working on Power BI reports embedded into a user application. How can I explain to users how data is secured in Power BI? I am getting many questions about security. Can you kindly explain how you explained this to your customers?
Thanks a lot
Well,
Once you embed (publish) a report to a website, all the people who can access that website link can access the report and data.
So it's a way to share a Power BI report for free.
As you are using reports embedded in another application, if that application is integrated with Azure, then you can use Azure AD with it. Moreover, you can implement RLS, OLS and data masking in Power BI, even in embedded mode.
Internally, datasets hosted in the Power BI service use Azure SQL with encryption, and even the credentials used to retrieve data are encrypted.
If you want to explain this to customers, use the security whitepaper:
https://learn.microsoft.com/en-us/power-bi/guidance/whitepaper-powerbi-security
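To make the RLS point concrete for the embedded case, below is a hedged sketch of generating an embed token with an effective identity via the Power BI REST API's GenerateToken endpoint. The workspace, report and dataset IDs, the role name and the AAD token acquisition are all placeholders for your own setup:

    # Hedged sketch: generate an embed token that applies RLS for a specific user.
    # Assumes you already hold an Azure AD access token for the Power BI REST API
    # and that the report's dataset has an RLS role defined (e.g. "SalesRegion").
    import requests

    access_token = "<AAD access token for the Power BI REST API>"  # placeholder
    group_id = "<workspace id>"      # placeholder
    report_id = "<report id>"        # placeholder
    dataset_id = "<dataset id>"      # placeholder

    url = f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}/reports/{report_id}/GenerateToken"
    body = {
        "accessLevel": "View",
        "identities": [
            {
                "username": "user@contoso.com",   # identity RLS rules are evaluated for
                "roles": ["SalesRegion"],         # RLS role(s) defined on the dataset
                "datasets": [dataset_id],
            }
        ],
    }

    resp = requests.post(url, json=body, headers={"Authorization": f"Bearer {access_token}"})
    resp.raise_for_status()
    print(resp.json()["token"])  # pass this embed token to the Power BI JavaScript client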

How can Azure Data Factory access a Custom Data Connector

I've just started to look at Azure Data Factory as a possible way to get data that we currently consume in Power BI via custom connectors, primarily to access Graph APIs. I can't see whether the same data is available to Azure Data Factory. Is there any way to achieve this?
Azure Data Factory has a number of different features which may help:
Web activity - call REST APIs from an ADF pipeline; can only access public URLs
Webhook activity - call endpoints and pass a callback URL
Azure Function - run Azure Functions in the pipeline; functions are very flexible, so they could probably do this
Custom activity via Azure Batch - run .NET code via Azure Batch; very customisable
Databricks notebook - call a notebook written in Scala, Python, R, Java or Spark SQL; completely customisable
Alternatively, look at Power BI dataflows, which offer self-service ETL, but remember the destination for your "L" (the load step) is only really Azure Data Lake Storage Gen2 and Power BI datasets.
We decided to use Logic Apps rather than Data Factory, as they offer a convenient means to access Graph APIs and support OAuth well, i.e. we're not using custom data connectors any more.
In addition, we put some of the more complicated logic into stored procedures, as Logic Apps, despite their name, can only handle basic logic.
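If you do go down the Azure Function (or other custom code) route, the Graph call itself is fairly small once you have an app-only token. A hedged sketch using MSAL for Python and the client-credentials flow; the tenant ID, client ID/secret and the Graph endpoint queried are placeholders:

    # Hedged sketch: call Microsoft Graph from custom code (e.g. an Azure Function
    # invoked by an ADF pipeline) using the client-credentials flow.
    # Tenant ID, client ID/secret and the Graph endpoint are placeholders.
    import msal
    import requests

    TENANT_ID = "<tenant id>"
    CLIENT_ID = "<app registration client id>"
    CLIENT_SECRET = "<client secret>"

    app = msal.ConfidentialClientApplication(
        CLIENT_ID,
        authority=f"https://login.microsoftonline.com/{TENANT_ID}",
        client_credential=CLIENT_SECRET,
    )

    token = app.acquire_token_for_client(scopes=["https://graph.microsoft.com/.default"])

    resp = requests.get(
        "https://graph.microsoft.com/v1.0/users",  # example endpoint; needs User.Read.All application permission
        headers={"Authorization": f"Bearer {token['access_token']}"},
    )
    resp.raise_for_status()
    print(resp.json())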

How can we Export data from Web Published Reports

I have published an application that I built with Power BI, and for some charts I want to make the data downloadable for users.
I couldn't find any straightforward way to do that; some sites talk about implementing a script for it.
Is there any real way that works for a published application?
One of the limitations of Publish to web is the inability to export data. For reference, here is the full list of things Publish to web does not support:
Reports using row level security.
Reports using any Live Connection data source, including Analysis Services Tabular hosted on-premises, Analysis Services Multidimensional, and Azure Analysis Services.
Reports shared to you directly or through an organizational content pack.
Reports in a group in which you are not an edit member.
"R" Visuals are not currently supported in Publish to web reports.
Exporting data from visuals in a report that has been published to the web.
ArcGIS Maps for Power BI visuals.
Reports containing report-level DAX measures.
Single sign-on data query models.
Secure confidential or proprietary information.
The automatic authentication capability provided with the Embed option doesn't work with the Power BI JavaScript API. For the Power BI JavaScript API, use the user owns data approach to embedding. Learn more about user owns data.
Exporting data is possible if you publish your report to the Power BI service and share it with your colleagues. But keep in mind that even in this case it may not be possible or allowed, e.g. if export has been disabled or the user doesn't have enough permissions. In addition, export has limits on the maximum number of columns, rows, and data size.
The best option is to export the data directly from the data source, which is used to build this report.
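If the dataset sits in the Power BI service and you have REST API access, another option is to pull the underlying data programmatically via the executeQueries endpoint. A hedged sketch, where the dataset ID, the 'Sales' table in the DAX query and the token acquisition are placeholders:

    # Hedged sketch: pull data out of a published dataset with the Power BI
    # executeQueries REST endpoint instead of the report UI. Dataset ID, table
    # name and token acquisition are placeholders for your own environment.
    import requests

    access_token = "<AAD access token for the Power BI REST API>"  # placeholder
    dataset_id = "<dataset id>"                                     # placeholder

    url = f"https://api.powerbi.com/v1.0/myorg/datasets/{dataset_id}/executeQueries"
    body = {"queries": [{"query": "EVALUATE TOPN(100, 'Sales')"}]}  # 'Sales' is a hypothetical table

    resp = requests.post(url, json=body, headers={"Authorization": f"Bearer {access_token}"})
    resp.raise_for_status()
    rows = resp.json()["results"][0]["tables"][0]["rows"]
    print(rows[:5])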

Transfer file from AWS S3 to OneDrive with AWS Lambda

A client of ours requested that we have copies of their files on both AWS S3 and OneDrive.
The usual MO: a file is sent from an iOS application to an AWS S3 bucket. This triggers an AWS Lambda function which attaches the file to an email and sends a copy to the client, which they then store on OneDrive. Now we want to skip the email part and transfer the file directly to OneDrive.
All my research so far points to Zapier, CloudRail or the MS Graph REST API. The problem I'm having is that we want to transfer the file with an AWS Lambda function (Java 8), automagically. Almost all the tutorials and examples on MS Graph need a client to log in manually, with mostly client-side logic. The other methods have more overhead, and we don't want to (unnecessarily) make our stack more complicated than it already is.
I realize this is a very specific case. We are systematically replacing the client's file management system, without disrupting their day-to-day operations too much.
Any conclusive pointers/examples/tutorials to get this done server side would be greatly appreciated.
I'm not sure how well S3 aligns with OneDrive; they are quite different models. OneDrive is provisioned per user, which begs the question: which user would you want to copy this file to? I would think Azure Storage would be a far better fit, as it uses a similar model to S3.
You can use the Microsoft Graph API to upload the file to a user's OneDrive. You would need to authenticate the user in order to obtain an access token and refresh token. Once this process is done, you can store that refresh token and retrieve an updated access token as needed (a minimal upload sketch follows the CloudRail note below).
CloudRail also requires you to authenticate the user, but there are methods to store and use an access token.
The services have two methods, loadAsString and saveAsString, which are used to store and load credentials. You could call loadAsString with your access token; the string can differ from service to service, but will look something like this: [{"access_token": "YOUR ACCESS TOKEN"}]
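For the Graph upload itself, here is a hedged sketch (in Python for brevity; the original stack is Java 8, but the HTTP calls are identical). It pulls the object from S3 with boto3 and PUTs it to a user's OneDrive via the simple upload endpoint. The bucket, key, target user and folder are placeholders, and an access token is assumed to be in hand; the simple PUT only suits small files, larger ones need an upload session.

    # Hedged sketch: copy an object from S3 to a user's OneDrive via Microsoft Graph.
    # Bucket, key, user and target folder are placeholders; access_token is assumed
    # to have been obtained already (see the refresh-token sketch further down).
    import boto3
    import requests

    def copy_s3_object_to_onedrive(bucket: str, key: str, user_id: str, access_token: str) -> None:
        s3 = boto3.client("s3")
        data = s3.get_object(Bucket=bucket, Key=key)["Body"].read()

        # Simple upload: fine for small files; use an upload session for large ones.
        url = f"https://graph.microsoft.com/v1.0/users/{user_id}/drive/root:/backups/{key}:/content"
        resp = requests.put(
            url,
            data=data,
            headers={"Authorization": f"Bearer {access_token}"},
        )
        resp.raise_for_status()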
To add to this, Microsoft now has a cloud migration tool, www.mover.io, that allows you to sync files and folders from most clouds into Azure Blob Storage, SharePoint or OneDrive directly, i.e. without downloading/uploading to a client machine.
I've personally used it only for a one-time sync, but I'm leaving it here for posterity.
The client only has to log in once, so if you already have the client ID and secret, you can do the manual flow once, then save the generated token file together with your code files in AWS. The next time the code is run, it uses the refresh token. The last time I did this I was able to set the refresh token to never expire, but I think Microsoft has since removed that option and now the token can only last something like 2 or 3 years at most.
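A hedged sketch of that refresh step against the Azure AD v2.0 token endpoint; the tenant, client ID/secret and stored refresh token are placeholders, and the response includes a new refresh token you should persist for next time:

    # Hedged sketch: exchange a stored refresh token for a fresh access token at
    # the Azure AD v2.0 token endpoint. Tenant, client ID/secret and the stored
    # refresh token are placeholders; persist the new refresh_token that comes back.
    import requests

    def refresh_access_token(tenant_id: str, client_id: str, client_secret: str, refresh_token: str) -> dict:
        resp = requests.post(
            f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token",
            data={
                "grant_type": "refresh_token",
                "client_id": client_id,
                "client_secret": client_secret,
                "refresh_token": refresh_token,
                "scope": "https://graph.microsoft.com/.default offline_access",
            },
        )
        resp.raise_for_status()
        return resp.json()  # contains access_token and a new refresh_token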