I am using a SharePoint Online list as a source for Power BI. There are 67 SharePoint Online lists that I have to append together and then map to an Excel file on my local disk.
I have added the Table.Buffer command, which is supposed to help the table load faster, but the table still takes a long time to load.
Can anyone help me with this issue or suggest an alternative?
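For reference, the append-and-buffer pattern I'm using looks roughly like this (List01 .. List67 are placeholders for my individual list queries):

    let
        // List01 .. List67 stand in for the 67 individual SharePoint list queries in the file
        Combined = Table.Combine({List01, List02, List03 /* ... List67 */}),
        // Table.Buffer caches the combined result so later steps reuse it instead of re-evaluating it
        Buffered = Table.Buffer(Combined)
    in
        Buffered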
Unfortunately, given the way Excel loads the data, you cannot really make it faster. You can try disabling background data (go to Query Options > Data Load and deselect 'Allow data preview to download in the background'), but that gives only a limited performance improvement. Here are some articles on the subject:
https://www.myonlinetraininghub.com/excel-forum/power-query/any-way-to-speed-up-really-slow-refresh-times-in-power-query
https://powerpivotpro.com/2017/07/power-query-refresh-speeds-suck/
But there is a better way: you can connect Power BI directly to the SharePoint lists without the Excel step, and it will be much faster!
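As a rough sketch (the site URL and list names are placeholders, and the exact column names should match the navigation step Power Query generates when you connect), the direct connection looks something like this:

    let
        // Placeholder site URL; point it at the site that hosts the lists
        Source = SharePoint.Tables("https://contoso.sharepoint.com/sites/MySite", [ApiVersion = 15]),
        // Keep only the lists you need; Title/Items are the columns the SharePoint
        // connector typically exposes in its navigation table
        Wanted = Table.SelectRows(Source, each List.Contains({"ListA", "ListB"}, [Title])),
        // Append the contents of the selected lists into a single table
        Combined = Table.Combine(Wanted[Items])
    in
        Combined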
Perhaps it's better to rephrase my question.
If I have a Power BI report...
Can I somehow make copies of this report with different data (datasets?) via an API, on demand?
e.g.
A user requests a report with some parameters,
the existing report gets copied with a new dataset and published.
tnx
Brian
== original question below ==
Would anyone be able to point me in the right direction here?
I currently have a C# application that generates reports, which are simply Excel files.
Each time I create a report, I simply open an Excel file with some pre-existing formatting, set the data in a particular sheet, then save it to a new location.
How can I achieve something similar with Power BI?
Ideally I'd like to open a pre-canned Power BI report and pass it a parameter pointing to the 'dataset' I've just generated.
I don't have any strong feeling as to what the dataset would be; I'm open to whatever works:
a database, a static dataset created by C# somehow, a web service, etc.
Can anyone suggest how? tnx
How about creating your Power BI report based on your existing Excel files? Make sure to store them on SharePoint so that the Power BI online service can pick them up without additional gateways. You can then use your app to trigger the report refresh as soon as the Excel files have been updated.
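On the Power BI side, pointing the report at an Excel file stored on SharePoint could look roughly like this (the site URL, file name and sheet name are placeholders, and it assumes the file name is unique on the site):

    let
        Files = SharePoint.Files("https://contoso.sharepoint.com/sites/Reports", [ApiVersion = 15]),
        // Select the workbook by name and open it
        Workbook = Excel.Workbook(Files{[Name = "WeeklyReport.xlsx"]}[Content], null, true),
        // Pick the sheet your application writes the data into
        Data = Workbook{[Item = "Data", Kind = "Sheet"]}[Data],
        Promoted = Table.PromoteHeaders(Data, [PromoteAllScalars = true])
    in
        Promoted

Your application then only has to overwrite the workbook on SharePoint and trigger the refresh; no gateway is involved because the file lives in the cloud.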
When developing a query in Power BI with a database data source, making any changes causes the query editor to 'start from scratch' and re-query the database.
I'm wondering if there is a workaround that avoids the repeated long wait times, e.g. downloading a temporary local flat file of the full dataset, developing the query offline against it, and then swapping the live database connection back in when you are happy with it.
Importing the data once, exporting it as a CSV from a Power BI table visualisation, and re-importing it as a new data source would work, but maybe there's a simpler way?
Thanks
There are two approaches you can use:
If your database supports query folding, make the first step take just the top 200 records while you develop your query. Once you're happy with it, remove the FirstN filter (see the sketch below).
Load the entire table into the model, export it to a CSV using DAX Studio, develop your query against the CSV, and then switch back to the database once you're happy with it.
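To illustrate the first approach, a minimal sketch (server, database and table names are placeholders; your source may be a different connector) would be:

    let
        Source = Sql.Database("myserver", "mydb"),
        Orders = Source{[Schema = "dbo", Item = "Orders"]}[Data],
        // Development-only step: with query folding this becomes a TOP 200 in the database query.
        // Delete this step once the rest of the query is finished.
        Sample = Table.FirstN(Orders, 200)
    in
        Sample

For the second approach, the downstream steps stay the same; only the source steps are temporarily replaced with a Csv.Document/File.Contents pair pointing at the DAX Studio export, and restored once you're done.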
I have an existing report in Power BI with an Oracle data source, from which I access the tables directly.
But now I need to change the data source of the report to SSAS.
Using SSAS, we can access the Oracle tables and deploy the SSAS model to an Azure service.
Then, through this Azure service, we can access the data.
My question is: can we simply change the data source without making any changes to the report? I'm showing a lot of graphs in that report.
Since I'm new to Power BI, is this possible?
There will likely be an unpredictably large number of other issues (character encoding, date formatting, etc.) that you'll have to work through.
The way I might approach this problem: if the tables are exactly identical and you didn't make any changes within Power Query (like removing columns, merging tables, etc.), you may be able to modify the M code within the Advanced Editor and try to swap the data sources to see if it works.
Go into "Transform Data"
Select the table you want to modify
Click the "Advanced Editor" icon in the ribbon.
In my example I imported an Excel file, so my source line uses Excel.Workbook; for you, the replacement should be some kind of "AnalysisServices" line. I don't have an SSAS database to connect to, so I can't validate this. Try replacing your current source line with the connection to your SSAS data source and see if it works (see the sketch after these steps).
Save, apply the update, and see if it works.
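Purely as an illustration (I can't validate this, and every server, model and table name below is made up), the swap might mean replacing a source that looks something like this:

    // Before: the query reads the table straight from Oracle
    let
        Source = Oracle.Database("MyOracleServer"),
        SalesTable = Source{[Schema = "SALES", Item = "FACT_SALES"]}[Data]
    in
        SalesTable

with something along these lines:

    // After: the same query reads from the Azure Analysis Services model instead
    let
        Source = AnalysisServices.Database(
            "asazure://westeurope.asazure.windows.net/myserver",  // placeholder server
            "MyModel",                                            // placeholder model name
            [Query = "EVALUATE 'Fact Sales'"]                     // DAX that returns the same columns
        )
    in
        Source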
Again, this is not really advisable, but if you want to give it a try and the data sources are identical, this is how I might approach the problem before just re-making the report whole cloth.
For context, we would like to visualize our data in Google Data Studio; this dataset receives more entries each week. I have tried hosting our datasets in Google Drive, but it seems they're too large and this slows down Google Data Studio (the file is only 50 MB, so am I doing something wrong?).
I have loaded our data into Google Cloud Storage and then into Google BigQuery, and connected Google Data Studio to my BigQuery table. This has made the Google Data Studio dashboard much quicker to use!
I'm not sure what the best way is to update our data weekly in Google Cloud/BigQuery. I have found a slow way to do this by uploading the new weekly data to Google Cloud Storage and then appending it to my table manually in BigQuery, but I'm wondering if there's a better (or at least more automated) way to do this.
I'm open to any suggestions, and if you think BigQuery/Google Cloud Storage is not the answer for me, please let me know!
If I understand your question correctly, you want to automate the query that populates your table, which is connected to Data Studio.
If this is the case, then you can use scheduled queries in BigQuery. A scheduled query lets you define a query whose results are written to a destination table. In particular, you can specify repetition rules (as frequent as every 15 minutes) and execution settings, as well as destination writing options (destination table, write mode: append or truncate).
In order to use scheduled queries, your account must have the right permissions. You can have a look at the following documentation to better understand how to use scheduled queries [1].
Also, please note that on the front end, the updated data in the BigQuery table will only show up in Data Studio after a refresh (clicking the refresh button in Data Studio). To refresh the front-end visualization automatically, you can use the following plugin [2] or automate the click on the refresh button through browser console commands.
[1] https://cloud.google.com/bigquery/docs/scheduling-queries
[2] https://chrome.google.com/webstore/detail/data-studio-auto-refresh/inkgahcdacjcejipadnndepfllmbgoag?hl=en
I have to migrate some legacy data from a stand-alone SQL Server database to a SharePoint list.
I'm going to use a programmatic approach and write code that communicates with the SharePoint list ASMX web service.
Are there any "data transformation wizards" to simplify such a task, or a better approach to port legacy data from a SQL Server database to a SharePoint list?
Thank you in advance!
Since this is a one-time operation, I would not worry about best practice but would consider the fastest way to do it.
You can use Excel 2010 (I have not tested it with Excel 2007) to export data to SharePoint 2010. Here are the high-level steps:
Import the data from SQL Server using the Data tab in the ribbon.
Excel will automatically create a table.
Now you can prepare the data for export to SharePoint: remove unwanted columns, add new columns, remove unwanted rows, rearrange columns, etc.
While in the table, use the "Export Table to SharePoint List" functionality to publish your data to SharePoint. More information is available at: http://office.microsoft.com/en-gb/excel-help/export-an-excel-table-to-a-sharepoint-list-HA010131472.aspx
It is quick! But be aware of the limitations:
1. It cannot publish data to a list that already exists.
2. It will not create a content type for the exported list; the columns are attached directly to the list.
If you want greater control over the migration, programming may be the way to go, unless someone in this great forum has a better idea!