Automatically Exporting Power BI Visualisation Data?

I need to automatically extract the raw data behind a Power BI visualisation across multiple published reports.
Why not just pull the underlying dataset? Because the visualisations use the anomaly detection features of Power BI, which add anomaly flags that are not available in the underlying dataset (basically, the visualisations contain calculated columns that are not included in the main Power BI data model).
A REST API solution would be ideal, but dumping CSV files or other more roundabout methods are fine.
So far, the closest functionality I can see is in the JavaScript API here - https://learn.microsoft.com/en-us/javascript/api/overview/powerbi/export-data - which allows a website to communicate with an embedded Power BI report and pass information in and out. But this doesn't seem to match my implementation needs.
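For reference, the export-data call in that API looks roughly like this (a sketch assuming a report already embedded with powerbi-client and a visual located by a placeholder title; whether the anomaly flags actually come through in the export is exactly what I'm unsure about):

```typescript
// Sketch only: assumes a report already embedded with powerbi-client and a
// visual located by its (placeholder) title; export must be enabled for the tenant.
import { models, Report } from "powerbi-client";

async function exportVisualData(report: Report, visualTitle: string): Promise<string> {
  const pages = await report.getPages();
  const activePage = pages.find((p) => p.isActive);
  if (!activePage) throw new Error("No active page found");

  const visuals = await activePage.getVisuals();
  const visual = visuals.find((v) => v.title === visualTitle);
  if (!visual) throw new Error(`Visual "${visualTitle}" not found`);

  // Summarized exports the data as shown in the visual; result.data is a CSV string.
  const result = await visual.exportData(models.ExportDataType.Summarized);
  return result.data;
}
```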
I have also seen this https://learn.microsoft.com/en-us/azure/cognitive-services/anomaly-detector/tutorials/batch-anomaly-detection-powerbi which describes manually implementing anomaly detection via Azure services rather than the native Power BI functionality; however, this means abandoning the simplicity of the Power BI anomaly feature that made it attractive in the first place.
I have also seen the Stack Overflow question "PowerBI Report Export in csv format via Rest API", which mentions using XMLA endpoints; however, the client applications don't seem able to connect at the visualisation level - for example, I tried DAX Studio and it doesn't appear to have any way to query the data behind a specific visual.

I'm afraid everything published about Power BI says this is not possible. The export API only supports PDF, PPTX and PNG formats, so the Power Automate integration fares no better.
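To illustrate the point, here is a rough sketch of the export-to-file REST call; the workspace ID, report ID and token are placeholders, and the format field only accepts rendered-page formats:

```typescript
// Sketch only: placeholder IDs and token. The "format" field is the point -
// the accepted values are rendered formats like PDF, PPTX and PNG, not raw data.
const groupId = "<workspace-id>";
const reportId = "<report-id>";
const accessToken = "<aad-access-token>";

async function startExport(): Promise<string> {
  const resp = await fetch(
    `https://api.powerbi.com/v1.0/myorg/groups/${groupId}/reports/${reportId}/ExportTo`,
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${accessToken}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ format: "PDF" }), // PDF, PPTX or PNG - no CSV/JSON option
    }
  );
  if (!resp.ok) throw new Error(`Export request failed: ${resp.status}`);
  const job = await resp.json();
  // The export runs asynchronously; poll .../exports/{job.id} and download the
  // rendered file from .../exports/{job.id}/file once it succeeds.
  return job.id as string;
}
```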
The Stack Overflow question you link to has some information on retrieving the dataset, but that is before the anomaly detection has processed the data.
I'm afraid your best bet is indeed to use the Azure service. I'd suggest ditching Power BI and going to an ETL tool like Data Factory, or even to the Azure ML offerings Microsoft provides. You'll also be more flexible than in Power BI, since you'll have the full power of Python/R notebooks at your disposal.
Sorry I can't give you a better answer.

Related

How are you coping with BigQuery, especially when you come from a traditional RDBMS background like Oracle/MySQL?

I am new to BQ. I have a table with around 200 columns, and when I wanted to get the DDL of this table there was no ready-made option available. CTAS is not always desirable: sometimes we don't have a reference table to create from with CTAS, and sometimes we just want a simple DDL statement to recreate a table.
I also wanted to edit the schema of a BigQuery table to change a column's mode: the previous mode is NULLABLE and it now needs to be REQUIRED (the column has only ever been loaded with non-null values so far).
Looking at these scenarios, the lengthy solutions in the Google documentation, and the lack of a direct solution in terms of SQL statements (rather than API calls/UI/scripts), I'm not impressed with BigQuery; it has many limitations. And the BigQuery web UI query editor is so small that you need to scroll many times to see a query as a whole, along with many other web UI issues, as you know.
Just wanted to know how you are all handling/coping with BQ.
I would like to elaborate a little more on @Pentium10's and @guillaume blaquiere's comments.
BigQuery is a serverless, highly scalable data warehouse that comes with a built-in query engine, which is capable of running SQL queries on terabytes of data in a matter of seconds, and petabytes in only minutes. You get this performance without having to manage any infrastructure.
BigQuery is based on Google's column-based data processing technology called Dremel and is able to run queries against up to 20 different data sources and 200 GB of data concurrently. The Prediction API allows users to create and train a model hosted within Google's system; the API recognizes historical patterns to make predictions about patterns in new data.
BigQuery is unlike anything that has been used as a big data tool before. Nothing seems to compare to the speed and the amount of data that can fit into BigQuery. Data views are possible and are recommended for use with basic data visualization tools.
This product typically comes at the end of the big data pipeline. It is not a replacement for existing technologies but it complements them. Real-time streams representing sensor data, web server logs or social media graphs can be ingested into BigQuery to be queried in real time. After running ETL jobs on a traditional RDBMS, the resultant data set can be stored in BigQuery. Data can be ingested from data sets stored in Google Cloud Storage, through direct file import, or through streaming data.
I recommend having a look at the book Google BigQuery: The Definitive Guide: Data Warehousing, Analytics, and Machine Learning at Scale, which includes a walkthrough of how to use the service and a deep dive into how it works.
Beyond that, I found a really interesting Medium article for data engineers new to BigQuery, with considerations regarding DDL, the UI, and best practices.
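On the DDL point specifically: BigQuery's INFORMATION_SCHEMA exposes a generated DDL statement per table, which may cover the simple "recreate this table" case. A minimal sketch with the Node.js client (the dataset and table names are placeholders):

```typescript
// Sketch only: assumes the @google-cloud/bigquery client and default credentials;
// "mydataset" and "mytable" are placeholders.
import { BigQuery } from "@google-cloud/bigquery";

async function printTableDdl(): Promise<void> {
  const bigquery = new BigQuery();
  // INFORMATION_SCHEMA.TABLES includes a generated DDL statement per table.
  const query = `
    SELECT ddl
    FROM \`mydataset.INFORMATION_SCHEMA.TABLES\`
    WHERE table_name = 'mytable'
  `;
  const [rows] = await bigquery.query({ query });
  console.log(rows[0]?.ddl);
}
```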
I hope you find the above pieces of information useful.

How can we export data from web-published reports?

I have published an application that I built with Power BI, and for some charts I want to make the data downloadable for users.
I couldn't find any straightforward way to do that; some sites talk about implementing a script for it.
Is there a real way to do this for a published application?
Exporting data is one of the documented limitations of Publish to web. The features and scenarios that Publish to web does not support include:
Reports using row-level security.
Reports using any Live Connection data source, including Analysis Services Tabular hosted on-premises, Analysis Services Multidimensional, and Azure Analysis Services.
Reports shared to you directly or through an organizational content pack.
Reports in a group in which you are not an edit member.
"R" Visuals are not currently supported in Publish to web reports.
Exporting data from visuals in a report that has been published to the web.
ArcGIS Maps for Power BI visuals.
Reports containing report-level DAX measures.
Single sign-on data query models.
Secure confidential or proprietary information.
The automatic authentication capability provided with the Embed option, which doesn't work with the Power BI JavaScript API (for the JavaScript API, use the user owns data approach to embedding).
Exporting data is possible if you publish your report to the Power BI service and share it with your colleagues. But keep in mind that even in this case it may not be possible or allowed, due to the limitations above, e.g. if export has been disabled or the user doesn't have sufficient permissions. In addition, export has limits on the maximum number of columns and rows and on data size.
The best option is to export the data directly from the data source that is used to build the report.
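If the report has been published to the Power BI service and you only need rows from its dataset, one roundabout option is the executeQueries REST endpoint, which runs a DAX query against the dataset. A rough sketch (the dataset ID, the 'Sales' table and the token are placeholders, and the tenant must allow this API):

```typescript
// Sketch only: placeholder dataset ID, DAX query and token. This queries the
// published dataset (the model), not what an individual visual renders.
const datasetId = "<dataset-id>";
const accessToken = "<aad-access-token>";

async function runDaxQuery(): Promise<unknown[]> {
  const resp = await fetch(
    `https://api.powerbi.com/v1.0/myorg/datasets/${datasetId}/executeQueries`,
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${accessToken}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        queries: [{ query: "EVALUATE TOPN(1000, 'Sales')" }], // placeholder DAX
        serializerSettings: { includeNulls: true },
      }),
    }
  );
  if (!resp.ok) throw new Error(`executeQueries failed: ${resp.status}`);
  const body = await resp.json();
  return body.results[0].tables[0].rows; // rows come back as JSON objects
}
```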

Power BI Desktop vs Web Client

What is the difference between the Power BI desktop client and the web client? Both seem to have the same features. What can the desktop client do that the web client cannot?
I'm not going to be exhaustive, since there are a ton of features in both experiences. Power BI Desktop is intended as a tool for analysts to work with data. It includes data load, mashup, data modeling, and reporting capabilities. You can create models with relationships, calculated columns, and DAX measures, and you can build elaborate transforms to manipulate and shape the data, or merge data from multiple sources into a single data model.
The web version of reports really focuses on the reporting piece. If someone else is doing all the data modeling for you, then the web reporting UI is pretty comprehensive. If you need to do the data modeling yourself, then Desktop is the way to go. Desktop does have the added benefit of a file you can save or archive. It doesn't support the DirectQuery sources or push datasets that the web reporting feature does, so there are at least some limitations.
Which you use really depends on the types of problems you're trying to solve.

Create a simplified version of Tableau using Google Chart Tools

Is it possible to create a simplified version of Tableau (just basic visualization) using Google Chart Tools? If so, what are the main challenges in imitating Tableau?
What are the advantages of Tableau over Google Chart Tools?
I would recommend reading a bit about what Tableau and Google Charts actually do.
Google Charts is a library for creating visualisations. Basically, you need to have your data ready in the correct format, and then with the API you can create and customize a graph to publish on the web.
Tableau, on the other hand, lets you connect directly to different data sources, blend and join those sources, and then create dashboards that give you an overview of whatever data you would like to view.
So, to answer your first question:
Is it possible to create a simplified version of Tableau (just basic visualization) using Google Chart Tools?
Yes, it is possible to create basic visualisations with Google Charts (that's what it's made for), but that's also possible with Excel (although Excel won't give you the same interactivity online).
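For example, a basic chart with Google Charts looks roughly like this (a sketch: the loader script, a target div with id "chart_div", and the data values are all assumed):

```typescript
// Sketch only: assumes the Google Charts loader script
// (https://www.gstatic.com/charts/loader.js) is included on the page and
// that a <div id="chart_div"> exists. Data values are made up.
declare const google: any;

google.charts.load("current", { packages: ["corechart"] });
google.charts.setOnLoadCallback(drawChart);

function drawChart(): void {
  // Data must already be shaped into rows/columns - there is no built-in
  // source blending like Tableau provides.
  const data = google.visualization.arrayToDataTable([
    ["Month", "Sales"],
    ["Jan", 120],
    ["Feb", 90],
    ["Mar", 150],
  ]);
  const chart = new google.visualization.LineChart(
    document.getElementById("chart_div")
  );
  chart.draw(data, { title: "Monthly sales" });
}
```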
If so, what are the main challenges in imitating Tableau?
You will not be able to join and blend data. You also won't have an easy-to-understand user interface, since Google Charts is handled at the code level. And you won't be able to just "play around" until you see something interesting, at least not as easily as in Tableau.
What are the advantages of Tableau over Google Chart Tools?
That depends on your use case. If you want to put some nice graphs on your website, Tableau won't be the right tool for the job and you should use Google Charts.
If you want complex, interactive dashboards to examine your data from different sources, you will struggle to get all of that together with just Google Charts, and you should probably have a look at Tableau.

Power BI Datasets from REST API and Power BI Designer

Is it possible to load datasets created by the Power BI REST API as data sources in the Power BI Designer? Is this functionality planned? This would be useful for using Power BI queries to combine data from other sources (e.g. older data in Azure DBs/tables) with the very latest data (e.g. for the current day, hour, etc.) that has been loaded via the API.
Also, at the moment it is not possible to perform a selective delete (only Clear All Rows). Is this planned for the future?
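For reference, the push-dataset row operations in question look roughly like this via the REST API (a sketch; the dataset ID, table name and token are placeholders):

```typescript
// Sketch only: placeholder dataset ID, table name and token.
const datasetId = "<push-dataset-id>";
const tableName = "<table-name>";
const accessToken = "<aad-access-token>";
const rowsUrl = `https://api.powerbi.com/v1.0/myorg/datasets/${datasetId}/tables/${tableName}/rows`;
const headers = {
  Authorization: `Bearer ${accessToken}`,
  "Content-Type": "application/json",
};

// Add rows to a push dataset (the scenario described above).
async function addRows(rows: object[]): Promise<void> {
  const resp = await fetch(rowsUrl, {
    method: "POST",
    headers,
    body: JSON.stringify({ rows }),
  });
  if (!resp.ok) throw new Error(`AddRows failed: ${resp.status}`);
}

// The only delete available is clearing the whole table ("Clear All Rows");
// there is no selective, row-level delete.
async function clearAllRows(): Promise<void> {
  const resp = await fetch(rowsUrl, { method: "DELETE", headers });
  if (!resp.ok) throw new Error(`DeleteRows failed: ${resp.status}`);
}
```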
Of course, the service is still in preview, but it seems some more features like this are needed to support production scenarios.
This is something we're considering, but it's more at the idea stage at this point. Would you submit this request on our support site? https://support.powerbi.com/forums/265200-power-bi
We use the support site to track feature requests so we can keep you updated when features you're interested in come online.
Thanks,
-Lukasz
http://dev.powerbi.com
http://blogs.msdn.com/powerbidev