I made a dashboard with Power BI and I've been using the Google Storage URL to import the pictures, but it's taking too much time to render.
Is there a way to make this faster?
Some faster link to the pictures, or a URL that allows me to resize the files?
Years ago I used a BI product called Hyperion Interactive Reporting. It allowed me to connect to a data source and create data models from which I would create reports. So far, sounds like Power BI, right?
It also had the ability to connect to a metadata repository database. This database contained data that mapped the actual, often cryptic, table and column names in the database to human-readable business terms. For example, a column that I saw in Hyperion as "Cost Center" may have been in the database as costCenter, work_order, or PROJECT-NUMBER. (It would also allow me to define the default join paths, but let's keep this question smaller.) This provided a way to make report development easier.
In Power BI, I see that I can manually rename columns, one-at-a-time. (And each time I touch something minor like this, Power BI takes several seconds to validate the entire file.) But I also see the need for many models that use the same data sources. So I may be defining the "Cost Center" column a few hundred times (a handful of reports per data set to answer a specific type of question, a few data sets that need Cost Center because the transformations in the model will be different for each type of question, several different combinations of data sources that include Cost Center, etc.)
Is there a way to connect Power BI to a metadata repository? Is there a way in Power BI to say, "Across all of my models/datasets, if I'm using the costCenter column from the Financial database, display Cost Center to the user"?
With about 20,000 columns in my data warehouse and 20,000 reports in my current reporting system, this could become a big deal if we intend to migrate to Power BI.
TL;DR: There isn't an easy way to achieve this. What you have now is probably better than what you could achieve without a ton of work. If you do try it, use SSAS instead of Power BI Desktop to author models.
Does Power BI have a metadata repository? No. There are tools that can get metadata from Power BI models, but you would have to build the metadata repository manually. If you want a centrally managed environment like this, I would highly recommend using SQL Server Analysis Services (SSAS) on-premises, or even better, Azure Analysis Services in the cloud. (Azure Analysis Services gets new features sooner than SSAS installed on-premises.) While Power BI Desktop is a great self-service tool, I wouldn't author in it if I needed to control or report across the environment. There just aren't easy ways to corral all of the Power BI models into one report, and it's a much more manual process. SSAS needs IT support, costs more, and you will hit more issues than with Power BI Desktop, but you will need it if you need central control. It's possible that more management controls will be added to the PowerBI.com service over time, but as of November 2021, you can't do this easily.
So what's the difference between Power BI Desktop and SSAS? The same engine that is in Power BI Desktop also exists in SSAS. When you start Power BI Desktop, it actually starts an SSAS instance behind the scenes. Using SSAS directly just makes it easier to connect to the database behind the scenes and see all the models in the environment from one place, while Power BI Desktop doesn't let you peek behind the scenes and only loads a single model at a time.
How do you get the metadata? It is easy to get SSAS metadata by using Power Query (or any SQL tool) to pull Dynamic Management View (DMV) data. DMVs are management views that hold all of the metadata of the model, and you just use SQL commands to get the data. Search on "SSAS DMV" to learn more. I have an Excel file that uses Power Query to pull all the key DMV views for all our models on our servers. It makes it easy to do the kind of analysis in your example.
For Power BI Desktop, you can connect to the hidden SSAS instance and do the same thing, but the report has to be open to do it, and there is no easy way to refresh the data--you pretty much just repeat the process each time. You connect via localhost:port_number, and the port number is randomly assigned each time you start Power BI, making it impossible to refresh the data pull. There are external tools such as DAX Studio, Power BI Helper, and dataMarc's Document Model that make that easier, but there's no easy way to automate building the metadata repository for Power BI Desktop files. I would use SSAS directly rather than trying to automate building a large metadata repository.
What about making changes to models? To my knowledge, there aren't any tools that make it easy to make changes across models, though again, you could build them manually. I don't think I would trust my own tool to automate changes across models, though. There's just too much that could go wrong. But if you had the need and the budget, you could build it. Look at tools like Tabular Editor, ALM Toolkit, and Microsoft's SSMS, and read up on DevOps pipelines for automating updates. These tools work against SSAS and Power BI Desktop, but again, you have to open the Power BI files to work with those models, which makes automation that much harder to do.
Note that all the external tools I've mentioned are free, except Tabular Editor v3 (Tabular Editor v2 is still free). PowerBI.tips is a great place to get all these tools from a single installer.
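If you do go down the build-it-yourself route, a crude starting point is an inventory of what is published to the service via the Power BI REST API. Below is a minimal TypeScript sketch (assuming you already have an Azure AD access token with read access; acquiring the token via MSAL or a service principal is out of scope) that lists the workspaces visible to the caller and the datasets in each one. This only gives you dataset-level metadata; column-level names still have to come from the DMVs or the external tools above.

```typescript
// Minimal sketch: inventory the workspaces and datasets visible to the caller
// through the Power BI REST API. The access token is assumed to be acquired
// elsewhere (MSAL, service principal, etc.).

const API = "https://api.powerbi.com/v1.0/myorg";

interface Group { id: string; name: string; }
interface Dataset { id: string; name: string; }

async function get<T>(token: string, path: string): Promise<T> {
  const res = await fetch(`${API}${path}`, {
    headers: { Authorization: `Bearer ${token}` },
  });
  if (!res.ok) throw new Error(`GET ${path} failed: ${res.status}`);
  return (await res.json()) as T;
}

export async function listDatasetInventory(token: string): Promise<void> {
  // Workspaces ("groups") the caller has access to.
  const groups = await get<{ value: Group[] }>(token, "/groups");
  for (const g of groups.value) {
    // Datasets in each workspace.
    const datasets = await get<{ value: Dataset[] }>(token, `/groups/${g.id}/datasets`);
    for (const d of datasets.value) {
      console.log(`${g.name}\t${d.name}\t${d.id}`);
    }
  }
}
```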
My company is currently building an enterprise data warehouse in SQL Server. We are looking at using Power BI, but I'm struggling to see how Power BI works in the context of a data warehouse.
For instance, what would it offer us, other than nicer-looking reports, that Cognos, which we are using now, doesn't? How is it at handling immense amounts of data?
In the context of an enterprise data warehouse, Power BI has a number of options.
1) It can be the visualisation layer over your SSAS data models: users can connect and quickly create reports, as it sits over the model rather than importing data into the Power BI report. Data processing is done on the server side, and it can access huge data models/databases.
2) Rather than creating SSAS data models, Power BI can act as a semi-semantic layer itself, as it is a branch of SSAS Tabular technology. Your users can quickly deploy reports built directly on the database. You can use it in DirectQuery mode, where, as with option 1, it sits over the database and query processing is done on the server side. You can also import data, but imported datasets are limited to 1 GB, and all report queries are then served from the imported dataset, not the server. With aggregations you can combine import and DirectQuery to sit over large databases.
The real benefit is enabling self-service BI, getting the users to create their own reports, so you can mix strategic reports (built by the business) and tactical reports (user-built). Power BI makes it quick to mix and match data sources, for example data under your organisation's domain (databases, cubes, Excel files, etc.) and data not under your domain (web pages, APIs, other sources).
You can also have Power BI on-premises or in the cloud. On-premises availability will depend on your SQL Server license type, or it will be an additional cost. Power BI also integrates fully with O365 and Azure, so depending on your application/tech stack, that may be a benefit. It also integrates very well with Power Apps and Power Automate, so power users can build solutions without raising requests to IT or others.
This is from my personal experience. I have had a number of projects for enterprise-scale customers that have moved from Cognos (and other tech like Tableau), fully or in part, due to cost and the integration of Power BI into O365. End users liked the large knowledge base, the support from MS, and the rapid update cadence/roadmap of the technology. The most common question is: can it replace X tech? The answer is maybe; it will depend on your report requirements and how it will integrate with your data sources. Another trend I've noticed is that some work has moved from IT/BI to the power users, particularly with the Power Apps/Power Automate functionality.
Power BI is also a lightweight ETL and modelling tool, so it is not just a visualisation tool. There are a number of blogs and articles that compare Power BI to Cognos, but they tend to be biased, so it will be tricky to find an objective answer.
I have hundreds of Power BI reports. Now I want to calculate the page load time of each report daily. Can anyone suggest an approach to find a solution?
Write an application that enumerates the workspaces and reports, embeds each report, and measures the time until the rendered event is raised.
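Here is a minimal TypeScript sketch of the embed-and-measure part, using the powerbi-client library. The container element, report ID, embed URL, and embed token are placeholders supplied by your own backend; to run this daily across hundreds of reports you would wrap it in a loop over the reports returned by the REST API and log the results somewhere.

```typescript
// Minimal sketch using powerbi-client: embed one report and time how long it
// takes until the "rendered" event fires. reportId, embedUrl, and embedToken
// are placeholders supplied by your own backend.

import * as pbi from "powerbi-client";

const powerbi = new pbi.service.Service(
  pbi.factories.hpmFactory,
  pbi.factories.wpmpFactory,
  pbi.factories.routerFactory
);

export function measurePageLoad(
  container: HTMLElement,
  reportId: string,
  embedUrl: string,
  embedToken: string
): Promise<number> {
  return new Promise((resolve, reject) => {
    const start = performance.now();

    const report = powerbi.embed(container, {
      type: "report",
      id: reportId,
      embedUrl: embedUrl,
      accessToken: embedToken,
      tokenType: pbi.models.TokenType.Embed,
    }) as pbi.Report;

    // "rendered" is raised once the visuals on the active page have finished drawing.
    report.on("rendered", () => resolve(performance.now() - start));
    report.on("error", (event) => reject(event.detail));
  });
}
```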
I saw this Warehouse Dashboard on YouTube (log in as Vendor without any username or password):
https://pbiewarehousedemo.azurewebsites.net/warehouse-summary
and I was wondering how they created this Warehouse Map with the moving items. It seems to be a real-time position tracker, so I thought I could use this to track my products in a production line to create a digital shadow. Unfortunately, they just used it as an example of embedded Power BI in their YouTube video and don't explain how it was created.
Does anyone know how to create such a real-time tracking dashboard? I didn't find anything on the internet.
It would help me a lot if someone had an idea of how to do this.
That report is a bit of a cheat: the visual is not an actual visual, it is a video.
Real-time tracking can sort of be done.
It would be something like:
The vehicle transmits a GPS position to a database/API/Azure Function
The data gets sent to a database
A function/flow/Data Factory pipeline refreshes the dataset with the latest data
That works well for tracking objects around a wide region. For a specific location, for example inside a factory, it would have to be a custom-built visual, or maybe an R/Python visual.
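As a rough illustration of the middle steps above (reading arrives, data lands somewhere, dataset gets the latest data), here is a minimal TypeScript sketch that pushes the latest readings into a Power BI push dataset over the REST API, so a dashboard tile or auto-refreshing report can pick them up. The dataset ID, the table name "Positions", the row shape, and the access token are all assumptions; the push or streaming dataset itself has to be created separately.

```typescript
// Minimal sketch of the "push the latest data" step: post new GPS readings into
// a Power BI push dataset via the REST API. datasetId, the "Positions" table,
// the GpsReading shape, and the token are assumptions for illustration.

const API = "https://api.powerbi.com/v1.0/myorg";

interface GpsReading {
  assetId: string;
  latitude: number;
  longitude: number;
  timestamp: string; // ISO 8601
}

export async function pushReadings(
  token: string,
  datasetId: string,
  readings: GpsReading[]
): Promise<void> {
  // Push-dataset tables accept batches of rows as a simple POST.
  const res = await fetch(`${API}/datasets/${datasetId}/tables/Positions/rows`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${token}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ rows: readings }),
  });
  if (!res.ok) throw new Error(`Push failed: ${res.status}`);
}
```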
Is it possible to load datasets created by the Power BI REST API as data sources in the Power BI Designer? Is this functionality planned? This would be useful for using Power BI Queries to combine data from other sources (e.g. older data in Azure DBs/tables) with the very latest data (e.g. for the current day, hour, etc) that has been loaded via the API.
Also, at the moment it is not possible to perform a selective delete (only Clear All Rows). Is this planned for the future?
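For reference, the whole-table clear I'm describing looks roughly like this as a sketch against the REST API (the token, dataset ID, and table name are placeholders); anything finer-grained currently means clearing the table and re-pushing the rows you want to keep:

```typescript
// Sketch of the whole-table clear available today: DELETE on a push dataset
// table's rows collection removes every row. Token, dataset ID, and table
// name are placeholders.

const API = "https://api.powerbi.com/v1.0/myorg";

export async function clearAllRows(
  token: string,
  datasetId: string,
  tableName: string
): Promise<void> {
  const res = await fetch(`${API}/datasets/${datasetId}/tables/${tableName}/rows`, {
    method: "DELETE",
    headers: { Authorization: `Bearer ${token}` },
  });
  if (!res.ok) throw new Error(`Clear rows failed: ${res.status}`);
}
```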
Of course we are still in preview, but it seems some more features like this are needed to support production scenarios.
This is something we're considering, but it's more at the idea stage at this point. Would you submit this request on our support site? https://support.powerbi.com/forums/265200-power-bi
We use the support site to track feature requests so we can keep you updated when features you're interested in come online.
Thanks,
-Lukasz
http://dev.powerbi.com
http://blogs.msdn.com/powerbidev