I have been tasked with researching Power BI connections to mainframe flat files, and in some cases VSAM files.
This is needed to replace an existing legacy BI/reporting tool that connects to the mainframe.
Power BI does not have a direct connection to the mainframe, so what would be the best way to connect to these data sources (flat files, and in some cases VSAM files; if we need to convert the VSAM files to flat files, we will do it)?
Are there any third-party tools that can be used to bridge this gap between Power BI (or any other BI tool) and the mainframe data files? (Our shop already converts VSAM files to flat files.)
Thanks
You could do it my way, but it really only works for smaller, text-based files.
I have some code on the mainframe that converts the data in question into a sensible .csv format. I then compose and send an email on the mainframe to myself from a pre-defined email address (in my case I use "SAVE_TO_SHAREPOINT#<company_name>.com") with the .csv data as an attachment.
I use Power Automate to pick up emails to myself from that email address and save the attachment to a SharePoint folder that is specified as the subject of the email.
This process can be automated to run whenever it is required. You can then use Power BI to pull in the .csv data from the specified SharePoint folder.
This won't work for everyone, but it currently works for me.
You could also use FTP, but you can't FTP to SharePoint, so you'd have to work out how you want the FTP'd data to get into Power BI.
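For the last step, the Power Query side could look something like this. A minimal sketch, assuming a hypothetical site URL and folder name (swap in your own):

    let
        // Connect to the SharePoint site (placeholder URL)
        Source = SharePoint.Files("https://contoso.sharepoint.com/sites/Reporting", [ApiVersion = 15]),
        // Keep only the .csv files the Power Automate flow drops into the target folder
        Filtered = Table.SelectRows(Source, each Text.EndsWith([Name], ".csv")
            and Text.Contains([Folder Path], "MainframeDrops")),
        // Take the most recently modified file and parse it
        Newest = Table.Sort(Filtered, {{"Date modified", Order.Descending}}){0},
        Csv = Csv.Document(Newest[Content], [Delimiter = ",", Encoding = 65001]),
        Promoted = Table.PromoteHeaders(Csv, [PromoteAllScalars = true])
    in
        Promoted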
Related
In this scenario, I am using an on-premises data source (an Excel file) in Power BI. I have built some ETL steps and calculations on top of it. Now the problem is that my client wants me to migrate all on-premises data sources to SharePoint. If I change the file path in Power BI, will I keep all the changes I made earlier?
You don't have to change any transformations if you read the Excel file via the web connector. However, if you're using the SharePoint Folder connector (recommended), you'll probably have to add an additional navigation step to access the sheets.
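That additional step is just drilling from the file's binary content into the workbook and then the sheet. A minimal sketch, assuming a hypothetical site URL, file name, and sheet name:

    let
        // The SharePoint Folder connector lists every file in the site's libraries
        Source = SharePoint.Files("https://contoso.sharepoint.com/sites/Finance", [ApiVersion = 15]),
        // Pick out the migrated workbook (placeholder file name)
        FileContent = Table.SelectRows(Source, each [Name] = "SalesData.xlsx"){0}[Content],
        // The extra navigation step: open the workbook, then drill into the sheet
        Workbook = Excel.Workbook(FileContent, null, true),
        Sheet = Workbook{[Item = "Sheet1", Kind = "Sheet"]}[Data],
        Promoted = Table.PromoteHeaders(Sheet, [PromoteAllScalars = true])
    in
        Promoted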
I think I'm on a severe wishful thinking trip here, please confirm if I am, if not then what would be the process to accomplish this?
Every day a team compiles an Excel file and sends it as an email attachment. I have a Power Automate flow that saves the attachment into a SharePoint space.
I can create a Power BI report and manually connect to and load those files, but it seems the real "Power" would be to not have to manually connect the new file (which has the creation date in the filename, e.g. 'Daily DRR 3-16-22.xlsx') every day, ergo:
What steps (using the Power Platform) would I take to have my Power BI report automatically (dynamically) refresh using the last 5 files (days) in the SharePoint drive? Is that possible?
Thanks in advance!
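It is possible, and it shouldn't need anything beyond the SharePoint Folder connector: filter the folder listing down to the five newest workbooks and combine them, so each scheduled refresh picks up whatever the latest five files are. A rough sketch, assuming a hypothetical site URL and that the data sits on the first sheet of each workbook with identical layouts:

    let
        Source = SharePoint.Files("https://contoso.sharepoint.com/sites/TeamSite", [ApiVersion = 15]),
        // Keep only the daily workbooks saved by the Power Automate flow
        Daily = Table.SelectRows(Source, each Text.StartsWith([Name], "Daily DRR") and [Extension] = ".xlsx"),
        // Sort newest first and keep the last five days
        Latest5 = Table.FirstN(Table.Sort(Daily, {{"Date created", Order.Descending}}), 5),
        // Pull the first sheet of each workbook (assumes identical layouts)
        WithData = Table.AddColumn(Latest5, "Data", each Excel.Workbook([Content], true){0}[Data]),
        Combined = Table.Combine(WithData[Data])
    in
        Combined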
We are using Sisense as our reporting tool.
We have many clients using Sisense.
These clients have a lot of dashboards and widgets.
Sisense stores its data in MongoDB.
I don't know much about Microsoft Power BI.
Is it possible to build a migration tool from Sisense to Microsoft Power BI?
Thank you.
Sisense stores its metadata in a MongoDB instance. Power BI, however, stores its metadata in the PBIX file. If you change the file extension from .pbix to .zip, you can navigate in and inspect the contents.
When the report is deployed to the Power BI Service, it uses a number of components to store the file and metadata: blob storage and a small SQL instance in the background. You cannot access these items or the data in them.
For the on-premises version of Power BI, Power BI Report Server (available in Premium only, or some enterprise licensing), a SQL Server database is required. This acts as a metadata store for the Power BI front end and also stores the files etc. for the reports loaded to it. You can access this metadata store. More details on the setup here.
I don't think there is a path to migrate data from the MongoDB to that SQL database, the service, or the files; it will be a full re-creation of the objects from one reporting technology to the other.
Actually, Power BI uses XML and Sisense uses JAQL; you can parse the JAQL to create a translator that builds rudimentary Power BI reports. Since Sisense uses ElastiCubes, dashboards, and widgets, you have to parse them all to build out the Power BI side. I successfully did this for SSRS; Power BI has a more complex layout, but it can nonetheless be done...
I built it in .NET using Newtonsoft to extract the JAQL (JSON) and then parsed it to translate to Power BI.
not that hard
I have a problem:
I have a PBI file containing three data sources: 2 SQL Server sources + 1 API call.
I have separate queries for each respective data source and an additional query that combines all three queries into a single table.
Both SQL Server sources have been added to a gateway and I can set scheduled refresh for each source, if I publish them in separate PBI files.
However, I cannot set scheduled refresh for the file that contains all three sources - both the data source credentials and scheduled refresh options are greyed out.
The manage gateway section of the settings page also shows no gateway options. If I publish the SQL Server data (with no API data) I can clearly see my data source and gateway under the gateway heading.
[screenshot of dataset settings]
Does anyone have any idea why this might be happening?
Thank you,
I had the same problem.
I have a PBI file with different data sources: SQL Server sources and APIs.
On the Power BI Service, the Data Source Credentials option was grayed out, so here's what I did:
Downloaded the file
Refreshed it locally and signed in to all data sources (the server or DB server name had changed, but not for the APIs)
Published it to the Power BI Service
It worked for me.
Same problem here. After additional poking around, I learned that the "Web Source" (API call) was the reason for the inability to refresh, and it can cause "Data Source Credentials" to be inaccessible. This was annoying to learn after diving down several rabbit holes.
Several (weak) workarounds:
Connect to the web source using Excel's Power Query (see the sketch after this list). Learn more about Excel's Power Query.
Make any needed transformations.
Put the Excel file in SharePoint Online folder or other PBI accessible directory.
Connect to the Excel file using the appropriate data source (e.g. SharePoint Folder).
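For the first step, the query in Excel could look something like this (a minimal sketch, assuming a placeholder endpoint URL and a JSON array response):

    let
        // Call the web source from Excel's Power Query instead of from the PBI dataset
        // (placeholder URL; substitute the real API endpoint)
        Source = Json.Document(Web.Contents("https://api.example.com/v1/data")),
        // Assumes the API returns a JSON array of flat records
        AsTable = Table.FromRecords(Source)
    in
        AsTable

The PBI dataset then only ever sees a file source rather than the web source, which is the point of the workaround.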
Alternatively, if the data is static, you can directly copy/paste values into PBI (if you just need to get this done and move on with your life), as in the sketch after these steps:
Copy target values
Open Power Query Editor
Home tab -> Enter Data
Paste values into table
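Under the hood, Enter Data just embeds the pasted values as a static table in the query. Conceptually it's equivalent to something like this (hypothetical column names and values):

    let
        // Static table embedded in the query; nothing is refreshed from outside
        Source = #table(
            {"Region", "Sales"},
            {{"East", 100}, {"West", 80}}
        )
    in
        Source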
Hopefully this will save some poor soul a little of their life.
As I understand it, Power BI creates its own internal tabular model... but where?
Say I'm working with sensitive data, and the reports will be ultimately published to a Report Server on prem. The reports will only be accessible by selected Active Directory groups. If, during development, I save the pbix file to a network share, or internally e-mail it to a colleague, will it contain the sensitive data in an accessible way? I'm just thinking about ways the information could fall into the wrong hands. Is there an option to automatically save the pbix file with no data?
If you open a PBIX file as a zip archive (see this reference), you can see that the data is stored in the DataModel file at the top folder level in a highly compressed format. Though it's compressed, I doubt it's encrypted, so it's likely that someone could theoretically decompress the data if they know what they're doing.
One option would be to export the report as a PBIT (Power BI template) instead, which is designed to save only the report structure, relationships, queries, and such, but not the actual data, if it comes from external sources.