Power BI and Azure Document DB

I have connected Power BI to Azure Document DB, but it is taking too much time for the data to load, and even more time to apply queries. Is there any way to reduce this loading time?

This is more of a generic question about Power BI, and BI tools more broadly. In general, you have to specify more filters on the queries (yes, they can be edited, even in plain SQL) so that less data is pulled into Power BI. Azure Cosmos DB itself is fast; it all depends on how much data you're trying to query. Also, make sure the data is in the same region as the users accessing it through Power BI.
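For example, rather than pulling a whole container into Power BI, a filtered query along these lines limits both the rows and the properties that get transferred. The container alias `c` is standard Cosmos DB SQL, but the property names here are purely illustrative; a statement like this can typically be supplied in the Cosmos DB connector's advanced options or edited afterwards in the query editor:

```sql
-- Illustrative Cosmos DB SQL: pull only recent orders and only the
-- properties the report needs, instead of SELECT * over the whole container.
SELECT c.id, c.customerId, c.orderDate, c.totalAmount
FROM c
WHERE c.orderDate >= '2023-01-01'
```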

Related

Power BI paginated report using a Power BI dataset from multiple Azure SQL servers

Just looking for a pointer as to the best way to go about this.
I'm comfortable with Power BI Report Builder (SSRS experience), but am pretty much a Power BI novice.
Basically, we have to create a Paginated (non-interactive) report for client consumption. It's going to be large, have multiple datasets, and use parameters / presence of data in the data sets to group data and/or turn sections on or off.
Not too much visualisation - some illustrative graphs and tables here and there - and quite a bit of text, some of it with data / text inserted via placeholders from the various datasets.
There are 3 Azure SQL databases I need to combine data from for this (split roughly into config, data and results).
In SSRS / SQL Server, I would have used one of my databases as the data source, and written a stored procedure per SSRS data set, joining to tables in other databases in the stored procedure query.
Then, in Report Builder, I would have just set up the datasets against the stored procs and gone from there.
On Azure SQL, I think I've got two options:
Option 1: write elastic queries so I can bring in the data I need from each database while querying against just one database (a rough sketch of what this involves is below).
Option 2: build a Power BI model/dataset that joins the relevant tables from the three databases together, publish it to the Power BI service and use that as my data source.
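For reference, option 1 would involve creating external tables in whichever database Report Builder connects to, roughly along these lines (the server, credential, database and table names below are purely illustrative):

```sql
-- Run in the Azure SQL database the report will query; names are illustrative.
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password>';

CREATE DATABASE SCOPED CREDENTIAL ElasticCred
    WITH IDENTITY = 'reportreader', SECRET = '<password>';

-- One external data source per remote database (config, results, ...).
CREATE EXTERNAL DATA SOURCE ConfigDb WITH (
    TYPE = RDBMS,
    LOCATION = 'myserver.database.windows.net',
    DATABASE_NAME = 'Config',
    CREDENTIAL = ElasticCred
);

-- Mirrors the remote table's schema; queries against it run on the remote database.
CREATE EXTERNAL TABLE dbo.ClientConfig (
    ClientId   INT,
    ClientName NVARCHAR(100)
) WITH (
    DATA_SOURCE = ConfigDb
);
```

A stored procedure per report dataset could then join local tables to dbo.ClientConfig as if everything lived in one database, much like the old SSRS approach.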
What's the best solution for my reporting scenario?
Cheers

Power BI and shared datasets: how to allow users to create new measures and reports and publish them

We are having difficulty finding a method of sharing a dataset and allowing users to use that dataset to create and publish their own reports. This would include the ability to create new measures (DAX) and then publish the reports themselves. Using the "service" live connection does not seem to allow that, and without it there seems to be an issue with refreshing the data once the dataset is downloaded and modified with new columns/measures, etc.
Greatly appreciate any help on this. So far I have seen nothing that shows how to do any of this, so I have to assume it may not be possible? Thank you.
Live Connect to a Power BI Dataset allows for local measures.
If you need further modelling changes when working with a remote dataset, the DirectQuery for Power BI Datasets and AAS feature (currently in preview) enables you to mash up remote dataset tables with local tables, and allows you to add calculated columns to remote tables.
But you should use this with some care, as the query processing is split between the local model and the remote model(s), which can cause performance issues.

Can I freeze the data I have in a Power BI dataset to use it offline?

I have a Power BI dataset that takes its data from software made by the IT team in my organization.
I was wondering if it is possible for me to "freeze" all the data in the Power BI dataset (like taking a snapshot of the data as of today, for example) and use this dataset for further analysis (I have another Power BI file linked to that Power BI dataset). I know the data won't refresh, but that's not important for what I need to do, as I only need the past info.
The reason I need to know if this is possible is that I'm going overseas for a month and won't have access to the original dataset. Downloading all the data into one Excel file is impossible as it is way too big.
Thanks
It sounds like you're after some sort of snapshotting functionality.
If you just want to keep the file as-is, you can download the .pbix and simply not refresh it, provided it's in Import mode.
However, one approach you could take if you want to continue development without worrying about accidentally refreshing is to use a Power BI dataflow.
You could copy your Power Query queries to a dataflow, refresh them all as at today, and then not refresh the dataflow again.
You can then point your Power BI dataset to the dataflow:
https://learn.microsoft.com/en-us/power-bi/transform-model/dataflows/dataflows-create
That way, if you want to do further transformation of the data, you won't be getting new data from the data source (so long as you don't refresh the dataflow).

How to deploy Power BI reports and connect them to a single Power BI Dataset

As far as I know, deploying a Power BI report from Power BI Desktop results in two items: the report itself and the dataset. Deploying a new report that uses the same dataset will deploy the new report plus a second copy of the same dataset to the Power BI service. That is not what I want. To avoid confusing end users and others, I want only a single dataset deployed.
I want to use Azure DevOps to deploy to the Power BI service in a Dev, Test and Prod fashion. The dataset will be an Azure Analysis Services data model, but the principle should be the same. I need exactly one dataset, and all reports must relate to that data model. I have heard that the REST API or PowerShell scripting can come to the rescue here.
So if any of you have done this or know of a good article that describes how to do this, I would be grateful.
Regards Geir
The best option is to separate the Power BI report into a front end and a back end. If you are importing data, you create a file purely for the dataset, with no reports built on it. You can then create the reports using the service connection to the dataset, or in Power BI Desktop via the 'Power BI Dataset' connection option. Both will use Live Connection mode, so you cannot add any other data sources to the model, for example a CSV file or a SQL database.
If you are connecting to an Azure Analysis Services data model, you can use this approach as well; however, since it is only a connection, not a full dataset, it should not be an issue to have copies of it, as each copy is just the connection. Having copies of the dataset is only an issue if you are importing data; in that case it is best to move things to dataflows, use the same front-end/back-end method, and plan the scheduling of the dataflows and then the datasets.
You can use the REST API to move reports and the datasets they connect to, and to move items to new workspaces. If you have Power BI Premium, it includes a lifecycle tool for moving items between dev/test/live workspaces.
If you create a report in Desktop and choose 'Power BI Dataset' as a live connection to work over, then when you upload that report to the same workspace it will upload only the report and connect it to the existing dataset.
https://radacad.com/power-bi-shared-datasets-what-is-it-how-does-it-work-and-why-should-you-care

Querying against imported data in Power BI

I’m pretty new to Power BI and am still at the point of assessing whether it will meet our needs.
I've got as far as realising that when creating a new report I can either Import tables (I'm using SQL Server) or use DirectQuery.
The particular report I'm trying to build is quite resource-intensive. Creating it in T-SQL requires iterating through hundreds of thousands of rows across multiple tables in a cursor and then storing some data in a temp table, which is the output of the query. I'm very concerned about using the DirectQuery option for this because of potential performance degradation on the server.
Is it possible in Power BI Desktop to Import the 5 tables that are used in my query, and then somehow write my query against those tables? That way (in theory) the query wouldn't be sent directly to our server each time someone views the report.
My question is based on my lack of knowledge of Power BI, so I may be asking for something that is completely impossible!
Thanks in advance for your help
Regards
Dotdev
That's exactly what the Import option does. It imports the tables only once (unless you refresh or change your query). The viewer sees the data that was extracted at import time and packaged into the PBIX file, rather than a direct connection to the database.
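One extra option, if the cursor logic can be expressed as a set-based query: that query (or an EXEC of the existing stored procedure) can usually be pasted into the SQL Server connector's SQL statement box when importing, so the heavy work runs on the server only at refresh time rather than each time the report is viewed. A rough sketch, with purely illustrative table and column names:

```sql
-- Illustrative set-based replacement for the cursor, used as the import query.
-- Runs once per dataset refresh; viewers read the cached result from the PBIX.
SELECT  o.CustomerId,
        c.Region,
        SUM(oi.Quantity * oi.UnitPrice) AS TotalSales,
        COUNT(DISTINCT o.OrderId)       AS OrderCount
FROM    dbo.Orders      AS o
JOIN    dbo.OrderItems  AS oi ON oi.OrderId   = o.OrderId
JOIN    dbo.Customers   AS c  ON c.CustomerId = o.CustomerId
WHERE   o.OrderDate >= DATEADD(YEAR, -1, GETDATE())
GROUP BY o.CustomerId, c.Region;
```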