Is it possible to convert a .pbix file into a .bim file while preserving all the data connections and expressions / shared expressions?
I have a Power BI file that I've exported as a .pbit file, loaded into Tabular Editor, saved as a .bim file, and then loaded into SSDT (Visual Studio 2015). My compatibility level is 1400.
The problem is that when converting from .pbix to .pbit I lose the data connections and shared expressions. The data connections are saved as "mashup" connection strings inside the database, which reference back to the instance of Power BI Desktop I had open.
How can I have these data connections remain as Oracle or SQL Server connections?
You can import a .pbix file into Azure Analysis Services. At that point, it becomes a regular Tabular model that you can download as an SSDT project (including the Model.bim file). However, you'll have to pay for the Azure Analysis Services instance during this operation.
Other than that, I guess you could ask the author of Tabular Editor to provide this functionality.
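If you want to check what actually survived the .pbit → .bim round trip before opening the project in SSDT, note that Model.bim at compatibility level 1400 is plain JSON, so you can inspect its dataSources and expressions collections directly. A minimal sketch (the file path is an assumption; point it at your own project):

```python
import json

# Minimal sketch: list which data sources and shared expressions made it into
# the exported Model.bim. At compatibility level 1400 the model is stored as
# JSON. The path is an assumption -- adjust it to your project.
with open("Model.bim", encoding="utf-8-sig") as f:
    bim = json.load(f)

model = bim.get("model", {})

print("Data sources:")
for ds in model.get("dataSources", []):
    # Provider data sources carry a connectionString; structured (Power Query)
    # data sources carry connectionDetails instead.
    detail = ds.get("connectionString") or ds.get("connectionDetails")
    print(f"  {ds.get('name')}: {detail}")

print("Shared expressions:")
for expr in model.get("expressions", []):
    print(f"  {expr.get('name')} ({expr.get('kind')})")
```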
I have been tasked with researching Power BI connections to mainframe flat files (in some cases VSAM files).
This is needed to replace an existing legacy BI/reporting tool that connects to the mainframe.
Power BI does not have a direct connection to the mainframe, so what would be the best way to connect to these data sources (flat files, and in some cases VSAM files; if we need to convert VSAM files to flat files, we will do it)?
Are there any third-party tools that can be used to bridge this gap between Power BI (or any other BI tool) and the mainframe data files? (Our shop already converts VSAM files to flat files.)
Thanks
You could do it my way, but it really only works for smaller, text-based files.
I have some code on the mainframe that converts the data in question into a sensible .csv format. I then compose and send an email on the mainframe to myself from a pre-defined email address (in my case I use "SAVE_TO_SHAREPOINT#<company_name>.com") with the .csv data as an attachment.
I use Power Automate to pick up emails to myself from that email address and save the attachment to a SharePoint folder that is specified as the subject of the email.
This process can be automated to run whenever it is required. You can then use Power BI to pull in the .csv data from the specified SharePoint folder.
This won't work for everyone, but it currently works for me.
You could also use FTP, but you can't FTP to SharePoint, so you'd have to work out how you want the FTP'd data to get into Power BI.
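To illustrate the conversion step, here is a minimal sketch of turning a fixed-width flat file into a .csv, whether you run it on the mainframe side or after transfer. The field names and column positions are hypothetical; map them to your own record layout.

```python
import csv

# Sketch: convert a fixed-width flat file (e.g. a VSAM extract) to .csv.
# The field names and column positions are hypothetical -- replace them with
# the layout from your own copybook.
LAYOUT = [
    ("account_id", 0, 10),
    ("customer_name", 10, 40),
    ("balance", 40, 52),
]

def flat_to_csv(in_path: str, out_path: str) -> None:
    with open(in_path, encoding="utf-8") as src, \
         open(out_path, "w", newline="", encoding="utf-8") as dst:
        writer = csv.writer(dst)
        writer.writerow([name for name, _, _ in LAYOUT])
        for record in src:
            writer.writerow([record[start:end].strip() for _, start, end in LAYOUT])

if __name__ == "__main__":
    flat_to_csv("mainframe_extract.txt", "mainframe_extract.csv")
```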
In this scenario, I am using an on-premises data source (an Excel file) in Power BI, on which I have built some ETL steps and calculations. The problem is that my client now wants me to migrate all on-premises data sources to SharePoint. If I change the file path in Power BI, will I keep all the changes I made earlier?
You don't have to change any transformations if you read the Excel file via the web connector. However, if you're using the SharePoint Folder connector (recommended), you'll probably have to add an additional navigation step to access the sheets.
We are using Sisense as our reporting tool.
We have many clients using Sisense.
These clients have a lot of dashboards and widgets.
Sisense stores its data in MongoDB.
I don't have any experience with Microsoft Power BI.
Is there any possibility of building a migration tool from Sisense to Microsoft Power BI?
Thank you.
Sisense stores its metadata in a MongoDB instance, whereas Power BI stores its metadata in the PBIX file. If you change the file extension from .pbix to .zip, you can navigate in and inspect the contents.
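As a rough illustration of that inspection step, you can list the contents of a .pbix without even renaming it, since it is a ZIP archive (the file name below is an assumption):

```python
import zipfile

# A .pbix file is a ZIP archive, so its contents can be listed directly.
# The file name is an assumption -- use one of your own reports.
with zipfile.ZipFile("MyReport.pbix") as pbix:
    for entry in pbix.infolist():
        print(f"{entry.filename}  ({entry.file_size} bytes)")
```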
When the report is deployed to the Power BI Service, it uses a number of components to store the file and metadata: blob storage and a small SQL instance in the background. You cannot access these items or the data in them.
The on-premises version of Power BI, Power BI Report Server (available with Premium or certain enterprise licensing), requires a SQL Server database. This acts as a metadata store for the Power BI front end and also stores the files for the reports loaded to it. You can access this metadata store. More details on the setup here.
I don't think there is a path to migrate data from the MongoDB instance to the SQL database, the service, or the files; it will be a full re-creation of the objects from one reporting technology to the other.
Actually, Power BI uses XML and Sisense uses JAQL; parse the JAQL to create a translator that builds rudimentary Power BI reports. Since Sisense uses ElastiCubes, dashboards and widgets, you have to parse them all to build out the Power BI equivalents. I successfully did this for SSRS; Power BI has a more complex layout, but it can nonetheless be done.
I built it in .NET using Newtonsoft to extract the JAQL (JSON) and then parsed it to translate to Power BI.
It's not that hard.
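As a rough sketch of the parsing side (not the author's actual .NET/Newtonsoft code), here is what walking an exported Sisense dashboard definition might look like. The field names ("title", "widgets", "metadata", "panels", "jaql") are assumptions about the JAQL/JSON export format, so check them against your own files:

```python
import json

# Hypothetical sketch: walk an exported Sisense dashboard definition and pull
# out the pieces a translator would need to rebuild the report in Power BI.
# The field names used below are assumptions about the export format --
# verify them against your own dashboard exports.
with open("dashboard_export.json", encoding="utf-8") as f:
    dashboard = json.load(f)

print("Dashboard:", dashboard.get("title"))
for widget in dashboard.get("widgets", []):
    print("  Widget:", widget.get("title"), "type:", widget.get("type"))
    for panel in widget.get("metadata", {}).get("panels", []):
        # Each panel holds the JAQL expressions that define fields / measures.
        for item in panel.get("items", []):
            jaql = item.get("jaql", {})
            print("    Field:", jaql.get("dim") or jaql.get("formula"))
```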
I have an existing Power BI report that imports data from a SQL Server Analysis Services database. This is working fine and I can schedule automatic refreshes using the gateway provided by my organization.
I would now like to add some additional, but rarely changing, data that I only have in a local Excel file. When I add this data, the report stops refreshing automatically and complains that it has no gateway to refresh the Excel file.
What I would like is for Power BI to refresh the data from the SQL Server Analysis Services database but keep the existing Excel data without updating it. I will upload an updated version of the Power BI report if I need to change the data in the Excel file.
Is that possible? I couldn't find out how. I tried uploading the Excel file as a separate dataset to the Power BI Service and referencing that dataset in my report, only to find out that I cannot access a Power BI dataset and a SQL Server Analysis Services database from the same report.
Three things I can think of:
1. Upload the file to OneDrive/SharePoint so that it's accessible online (per Dev's answer).
2. If the data is simple enough, you can add the data directly into Power BI itself and skip the Excel file entirely.
3. You can disable the Excel file refresh so that Power BI does not try to refresh (and thus access) the local Excel file. (Not sure if this will work.)
I came across a similar issue. Yes, you can just use Enter Data to add a table, but you can only build a table with fewer than 3,000 cells, so you'd have to merge several tables if your data is larger than that.
Turning off the report refresh in the suggestion above (#3) still requires a gateway, unfortunately.
I just created a dataflow and plopped the data from my csv there. You'll have to create a connection and refresh it, but you don't need to schedule a refresh there, so no need to create a gateway.
Then just link the dataflow as a source to your .pbix file and set up your gateway to point at the dataflow.
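If you later want to refresh that dataflow on demand rather than through the portal, the Power BI REST API exposes a dataflow refresh endpoint. A rough sketch under those assumptions; the workspace and dataflow IDs and the access token are placeholders, and acquiring the token (e.g. via Azure AD) is out of scope here:

```python
import requests

# Rough sketch: request an on-demand refresh of a dataflow through the
# Power BI REST API. GROUP_ID, DATAFLOW_ID and ACCESS_TOKEN are placeholders
# you would supply yourself.
GROUP_ID = "<workspace-id>"
DATAFLOW_ID = "<dataflow-id>"
ACCESS_TOKEN = "<aad-access-token>"

url = (
    "https://api.powerbi.com/v1.0/myorg/"
    f"groups/{GROUP_ID}/dataflows/{DATAFLOW_ID}/refreshes"
)
response = requests.post(
    url,
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={"notifyOption": "MailOnFailure"},
)
response.raise_for_status()
print("Refresh requested:", response.status_code)
```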
Is there a difference between looking at the Power BI Desktop File (.PBIX extension) and looking at the report created when you go to Power BI Report Server and click on "Edit in Power BI Desktop"?
Will everything be exactly the same or would it be better to just look at the .PBIX file if I want to see exactly how the report was created?
I am unable to test this myself at this time.
When you open the file that has been uploaded to the Power BI (PBI) Service, you can be certain that this is your live file, the one currently being used in production.
When you open a .PBIX file on your computer using Power BI Desktop, it might be the same file as the one in the PBI Service, but there's no guarantee that no changes have been made to it.
So if you want to know what the current live file looks like, always download the .PBIX file from the PBI Service.