Querying against imported data in Power BI

I’m pretty new to Power BI and am still at the point of assessing whether it will meet our needs.
I’ve got as far as realising that when creating a new report I can either Import tables (I’m using SQL Server) or use DirectQuery.
The particular report I’m trying to build is quite resource intensive. Creating it in T-SQL requires iterating through hundreds of thousands of rows across multiple tables in a cursor and then storing some data in a temp table, which is the output of the query. I’m very concerned about using the DirectQuery option for this because of the potential performance degradation on the server.
Is it possible in Power BI Desktop to Import the 5 tables that are used in my query, and then somehow write my query against these tables? That way (in theory) the query wouldn’t be sent to our server each time someone views the report.
My question is based on my lack of knowledge of Power BI, so I may be asking something that is completely impossible!
Thanks in advance for your help
Regards
Dotdev

That's exactly what the Import option does. It imports the tables only once (unless you refresh or change your query). Viewers see the data that was extracted at import time and packaged into the PBIX file, rather than a direct connection to the database.
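To make that concrete, here is a minimal Power Query (M) sketch of writing a query against imported tables; Orders and Customers stand in for two of your five tables, and the column names are hypothetical. The join is evaluated at refresh time, not each time a viewer opens the report:

```
let
    // "Orders" and "Customers" are placeholder queries already imported from SQL Server.
    // This merge runs during refresh; viewers only read the cached result in the PBIX.
    Merged = Table.NestedJoin(Orders, {"CustomerID"}, Customers, {"CustomerID"}, "Customer", JoinKind.Inner),
    Expanded = Table.ExpandTableColumn(Merged, "Customer", {"Name", "Region"})
in
    Expanded
```

Set-based joins and groupings like this usually replace cursor-style row iteration; if the logic is genuinely procedural, the usual fallback is a view or pre-computed table on the SQL Server side that you then import.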

Related

Power BI develop query with cached dataset

When developing a query in Power BI with a database data source, making any changes causes the query editor to 'start from scratch' and re-query the database.
Wondering if there is a workaround that allows you to develop a query without repeated long waits, e.g. by downloading a temporary local flat file of the full dataset, developing the query offline against that file, and then swapping back to the live database connection once you are happy with it.
Importing the data once, exporting it as a CSV from a Power BI table visualisation, and re-importing it as a new data source would work, but maybe there's a simpler way?
Thanks
There are two approaches you can use.
1. If your database supports query folding, make the first step take just the top 200 records while you develop your query. Once you're happy with it, remove the FirstN filter (see the sketch after this list).
2. Load the entire table to the model, export it to a CSV using DAX Studio, develop your query against the CSV, and then switch back to the DB once you're happy with it.
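A minimal M sketch of the first approach, with placeholder server, database, and table names; against a folding source such as SQL Server, the Table.FirstN step is typically pushed down to the database as a TOP 200:

```
let
    // Placeholder connection and navigation; substitute your own source.
    Source = Sql.Database("myserver", "MyDatabase"),
    Sales = Source{[Schema = "dbo", Item = "Sales"]}[Data],
    // Development-only step: work against just 200 rows so each edit
    // re-queries quickly. Delete this step once the query is finished.
    Top200 = Table.FirstN(Sales, 200)
in
    Top200
```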

Are columns deleted in Import mode still refreshed from the data source? (Power BI)

There is a dataset that I connect to via ODBC, and I bring it into Power BI Desktop using Import mode. Since I won't be using most of the columns, I deleted the ones I don't need in the Power Query editor. Will the columns I deleted still be updated at scheduled refresh times? In other words, can I expect any performance improvement from deleting these columns?
My suspicion is that Power BI still keeps the data somewhere and refreshes it, because I can undo the operations I performed in Power Query. But I'm not sure; that's just my guess. Can anyone provide any helpful links?
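For reference, deleting columns in the Power Query editor just records a step like the following M (the DSN, navigation, and column names are placeholders). The undo you see is the recorded step list, not a second copy of the data; and when the ODBC driver supports query folding, this step usually folds into a narrower SELECT, so the deleted columns shouldn't be retrieved at refresh at all:

```
let
    // Placeholder DSN; the real navigation steps come from your ODBC connector.
    Source = Odbc.DataSource("dsn=MyDsn"),
    Orders = Source{0}[Data],  // stand-in for navigating to the actual table
    // The step the editor records when you delete columns:
    Trimmed = Table.RemoveColumns(Orders, {"LegacyCode", "InternalNotes"})
in
    Trimmed
```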

Power BI and shared datasets: how to allow users to create new measures and reports and publish them

We are having difficulty finding a method of sharing a dataset that allows users to create and publish their own reports from it, including the ability to create new (DAX) measures and then publish the reports themselves. Using the "service" live connection does not seem to allow that, and without it there seems to be an issue with refreshing the data once the dataset is downloaded and modified with new columns/measures, etc.
Greatly appreciate any help on this. So far I have seen nothing that shows how to do any of this, so I have to assume it may not be possible? Thank you.
A Live Connection to a Power BI dataset does allow for local, report-level measures.
If you need more modeling changes when working with a remote dataset, the DirectQuery for Power BI Datasets and AAS feature (currently in preview) lets you mash up remote dataset tables with local tables and allows adding calculated columns to remote tables.
But you should use this with some care, as query processing is split between the local model and the remote model(s), which can cause performance issues.

How to model queries in Power BI for daily append of new data instead of overwriting

I'm trying to build a simple report in Power BI based upon data published on a website.
Here is what I want to achieve:
1. This website publishes data for COVID cases in the country.
2. The numbers are just the current numbers, without any time series.
3. I want to fetch these numbers from this website daily and build a report on top of it (with time-series kind of analysis).
4. So I fetch these numbers (Get Data > Web > URL) and get them into a query.
5. I then add a custom column with a timestamp (M's DateTime.LocalNow() function) so the data carries the required timestamp (a sketch of this step follows the questions below).
6. Now I want to refresh this query daily, so that I get daily results in this query.
7. As expected, PBI simply overwrites the existing rows with the new data and the latest timestamp (my custom column).

I tried a few things:
1. Creating a new query and appending data to it; it doesn't seem to work, the existing data still gets overwritten (maybe it's the way I created the new query).
2. Exploring the incremental refresh functionality; it doesn't seem to fit my use case.
3. Looking at other similar posts; none seem to help me resolve this.

Questions:
1. Is there a simple workaround to circumvent point 7 above and have PBI append new data instead of overwriting the existing data?
2. Am I correct on point 2 above (that incremental refresh doesn't fit)?
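For reference, the timestamp step (5) in M looks roughly like this (CovidCases stands in for the table parsed from the website, and LoadedAt is just a name I chose):

```
let
    // "CovidCases" is a placeholder for the query produced by Get Data > Web > URL.
    WithTimestamp = Table.AddColumn(CovidCases, "LoadedAt", each DateTime.LocalNow(), type datetime)
in
    WithTimestamp
```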
Appreciate any pointers. Thanks in advance!
There is no simple workaround within Power BI.
Power BI is not designed to be used as a database where you store historical data. It's designed to connect to data and build reports from it, so you'll need to store the daily data somewhere external.
There are tons of ways to store the data. E.g., you could save each day's snapshot as a CSV in a folder that Power BI loads from, or you could write the rows to a database table and connect to that.
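For example, a minimal sketch of the folder approach in M, assuming each day's snapshot is saved as a CSV into one folder (the path and CSV options are placeholders):

```
let
    // Stack every daily snapshot CSV in the folder into one long table.
    Source = Folder.Files("C:\CovidSnapshots"),
    Csvs = Table.SelectRows(Source, each [Extension] = ".csv"),
    Parsed = Table.AddColumn(Csvs, "Data",
        each Table.PromoteHeaders(Csv.Document([Content], [Delimiter = ",", Encoding = 65001]))),
    Combined = Table.Combine(Parsed[Data])
in
    Combined
```

Something still has to write the daily file into that folder (a scheduled script, Power Automate, etc.); Power BI only reads whatever snapshots are there.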
Edit: That said, there is a non-simple workaround if this is something you really must do.
Though not recommended, you can use incremental refresh to trick Power BI into doing what you want.

Power BI and Azure Document DB

I have connected Power BI to Azure Document DB, but it is taking too much time for the data to load, and even more time to apply the queries. Is there any way to reduce this data loading time?
This is more of a generic question about Power BI, and even about BI tools broadly. But, in general, you have to specify more filters in your queries (yes, they can be edited, even in plain SQL). Azure Cosmos DB is super fast; it all depends on how much data you're trying to query. Also, make sure the data is in the same region as the users who access it from Power BI.
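For instance, a hedged M sketch of pushing a filter down to Cosmos DB, assuming the Document DB connector's DocumentDB.Contents function with its optional SQL query (the account URL, database, collection, and fields are placeholders):

```
let
    // Placeholder account/database/collection; the Query option asks Cosmos DB
    // to evaluate the filter server-side instead of shipping the whole container.
    Source = DocumentDB.Contents(
        "https://youraccount.documents.azure.com:443/",
        "MyDatabase",
        "MyCollection",
        [Query = "SELECT c.id, c.region, c.total FROM c WHERE c.updatedAt >= '2020-01-01'"]
    )
in
    Source
```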