Power BI: extract .pbix and decode DataModel from XPress9

How can I automate exporting a .pbix to a .pbit without Microsoft services? I've seen some hacky solutions using autogui, Power Automate, and others. Otherwise, does anyone have a Python solution, or any solution at all, to decode the DataModel file from a .PBIX (XPress9 algorithm)? Currently the only way to inspect a .PBIX is to export it to a .PBIT from within Power BI Desktop.
Sadly, all current solutions point to Microsoft's SQL Server Analysis Services (SSAS).
Power BI .pbix files are just zipped archives. When you extract all the files from a .pbix, the DataModel file is a binary encoded with the XPress9 algorithm. The DataMashup file no longer exists as of the August 2021 Power BI version 2.72.5556.801.
Here is a publication on XPress9 for those who are interested.

Both .pbix and .pbit are zipped files under the hood, so you can proceed the same way with both once you get hold of them. For the record, the DataMashup file is still very much in place for me as we speak (October 2022).
As it turns out, a .pbit file is nothing more than a .pbix file with the DataModel file removed. So you can automate the export by getting the .pbix first (through the APIs or otherwise), unzipping the archive, removing the DataModel file, rezipping it and renaming it to .pbit, simple as that.
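For instance, here is a minimal Python sketch of that unzip/strip/rezip flow. The file names are placeholders, and the SecurityBindings handling is an assumption on my part: .pbix archives carry a signature entry that may no longer validate once the contents change, so dropping it too seems prudent.

import zipfile

def pbix_to_pbit(pbix_path, pbit_path):
    # Copy the archive entry by entry, skipping the compressed data model
    # (and SecurityBindings, whose signature likely no longer matches; assumption).
    skip = {"DataModel", "SecurityBindings"}
    with zipfile.ZipFile(pbix_path, "r") as src, \
         zipfile.ZipFile(pbit_path, "w", zipfile.ZIP_DEFLATED) as dst:
        for item in src.infolist():
            if item.filename in skip:
                continue
            dst.writestr(item, src.read(item.filename))

pbix_to_pbit("report.pbix", "report.pbit")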
Now, if you are interested in decoding the DataModel file itself, that is another, more complicated, story. AFAIK, there is no way to decode XPress9 outside of Microsoft.

Related

Bulk upload / publish multiple Power BI files from a folder

I found a way to download all .pbix files in one workspace via PowerShell. Is there also a way to publish multiple Power BI files into one workspace (all files will be stored in one folder)?
Thanks for any help!
I found some ideas to do this via the Power BI REST API, but didn't really understand them...
You want the New-PowerBIReport cmdlet, which "Loads a Power BI report from a pbix file and deploys it to the Power BI service."
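If you'd rather script it against the REST API the question mentions, here is a rough Python sketch using the Imports endpoint. The folder name, workspace ID and token are placeholders; acquiring the Azure AD access token (e.g. via MSAL) is left out.

from pathlib import Path
import requests

access_token = "..."  # Azure AD token with report write permissions (placeholder)
workspace_id = "00000000-0000-0000-0000-000000000000"  # target workspace GUID

for pbix in Path("reports").glob("*.pbix"):
    # POST /groups/{id}/imports uploads a .pbix as a multipart form upload
    url = f"https://api.powerbi.com/v1.0/myorg/groups/{workspace_id}/imports"
    with pbix.open("rb") as f:
        resp = requests.post(url,
                             params={"datasetDisplayName": pbix.stem},
                             headers={"Authorization": f"Bearer {access_token}"},
                             files={"file": f})
    resp.raise_for_status()
    print(f"Published {pbix.name}")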

Need to change the file path from on-premises to SharePoint in Power BI

In this scenario, I am using an on-premises data source, an Excel file, in Power BI. I have done some ETL work and calculations on it as well. Now the problem is, my client wants me to migrate all on-premises data sources to SharePoint. If I change the file path in Power BI, will I keep all the changes I made earlier?
You don't have to change any transformations if you read the Excel file via the web connector. However, if you're using the SharePoint Folder connector (recommended), you'll probably have to add an additional navigation step to access the sheets.

Power BI and parquet at ADLS Gen2

I'm able to connect to ADLS Gen2 from Power BI Desktop and work with CSV files.
The issue is that the same doesn't work for the Parquet format. Have you ever worked with Parquet in Power BI Desktop?
The problem arises when, after adding the Parquet table, I click on the Binary reference: Power Query is unable to read/preview the Parquet data. I tried both with and without Snappy compression.
I also tried to write the query manually:
let
    Source = AzureStorage.DataLake("https://xxx.dfs.core.windows.net/yyy/data.parquet"),
    #"File" = Source{[#"Folder Path" = "https://xxx.dfs.core.windows.net/yyy/data.parquet", Name = "data.parquet"]}[Content],
    #"Imported File" = Parquet.Document(#"File")
in
    #"Imported File"
But got the following exception:
The name 'Parquet.Document' wasn't recognized. Make sure it's spelled correctly.
This is despite the fact that the Parquet.Document function is documented. I'm using the latest version of Power BI Desktop (Dec 2019).
P.S. I've also faced the same issue while developing a DAX model for AAS from Visual Studio SSDT.
Power BI supports this natively now.
Just paste in the URL to the parquet file on your lake/storage account and you're good to go. Apparently this wasn't slated to go live until March 2021, but it appears for me in the Dec 2020 release.
Currently, you can't work directly with parquet files in Power BI Desktop. You'll need to leverage something like Azure Data Factory's wrangling data flows to convert to CSV or another consumable format first.
It looks like the function you're referring to was specifically added for this new feature in Azure Data Factory, which allows usage of Parquet files in wrangling data flows.
This might come soon for the Power BI Service's dataflows, too, but that's speculation on my part.
I have been able to successfully read parquet files stored in ADLSG2 via a Power BI Dataflow.
Unfortunately, you cannot progress to completion via the GUI; the Parquet format is not natively detected as a source datatype at the time of this writing. To get around the issue, select JSON or some other datatype so the wizard lets you proceed, then overwrite the generated M code in the Advanced query editor.
Note: this does not currently work with the June 2020 release of Power BI Desktop. It only works via a dataflow, from what I can tell:
let
    Source = AzureStorage.DataLake("https://xxxxxxxxxx.dfs.core.windows.net/yourcontainer"),
    Navigation = Parquet.Document(Source{[#"Folder Path" = "https://xxxxxxxxxx.dfs.core.windows.net/yourcontainer/yoursubfolder/", Name = "yourParquetFile"]}[Content]),
    #"Remove columns" = Table.RemoveColumns(Navigation, Table.ColumnsOfType(Navigation, {type table, type record, type list, type nullable binary, type binary, type function}))
in
    #"Remove columns"

Migrating a Power BI Desktop file to Analysis Services

Is it possible to convert a .pbix file into a .bim file while preserving all the data connections and expressions / shared expressions?
I have a Power BI file I've exported to a .pbit file, then loaded into Tabular Editor and saved as a .bim file, and then loaded the .bim file into SSDT (Visual Studio 2015). My compatibility level is 1400.
The problem is that when I convert from .pbix to .pbit, I lose the data connections and shared expressions. The data connections are saved as “mashup” connection strings inside the database, which reference back to the instance of Power BI Desktop I had open.
How can I have these data connections remain as Oracle or SQL server connections?
You can import a .pbix file into Azure Analysis Services. At that point, it becomes a regular Tabular model that you can download as an SSDT project (including the Model.bim file). However, you'll have to pay for the Azure Analysis Services instance during this operation.
Other than that, I guess you could ask the author of Tabular Editor to provide this functionality.

When I am working on a Power BI report using Power BI Desktop on my local machine, where is the data stored?

As I understand it, Power BI creates its own internal tabular model... but where?
Say I'm working with sensitive data, and the reports will be ultimately published to a Report Server on prem. The reports will only be accessible by selected Active Directory groups. If, during development, I save the pbix file to a network share, or internally e-mail it to a colleague, will it contain the sensitive data in an accessible way? I'm just thinking about ways the information could fall into the wrong hands. Is there an option to automatically save the pbix file with no data?
If you unzip a PBIX file (see this reference), you can see that the data is stored in the DataModel file at the top level of the archive, in a highly compressed format. Though it's compressed, I doubt it's encrypted, so it's likely that someone could theoretically decompress the data if they know what they're doing.
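To see this for yourself, a quick Python check (the file name is a placeholder) lists what the archive holds:

import zipfile

with zipfile.ZipFile("report.pbix") as z:
    # expect entries such as DataModel, Report/Layout and [Content_Types].xml
    for info in z.infolist():
        print(f"{info.filename:<40} {info.file_size:>12} bytes")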
One option would be to save the report as a PBIT instead, which is designed to save only the report structure, relationships, queries and such, but not the actual data if it comes from external sources.