Can I use a web service as a data source for creating Excel pivot tables?
Currently, the source data for the pivot table is exported from our SQL DB to a CSV file. Then the CSV file is loaded into a worksheet, and from there a pivot table is created in the same workbook.
Customers log in to a website, click some links, and an Excel file (with data and a pivot table) is generated. Since this is a public app, we prefer not to connect directly to the DB.
We control the database and generate the output, and we are looking to streamline this process. The SQL DB and the pivot tables cannot / will not change.
See http://www.vertex42.com/News/excel-web-query.html
What format does the "public-facing website" use to make the data available? A data file, or a table on a web page? The answer will determine how much scraping you'll need to do.
You'll still need to write the web service and have it run on a server. A possible alternative is to use Yahoo Pipes to do the conversions for you.
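If the site can expose the data as a plain CSV download, a related option is Excel's Power Query (Get Data > From Web). A minimal M sketch, assuming a hypothetical CSV endpoint (the URL is a placeholder, not a real address):

    // Sketch only: pulls a CSV over HTTP and parses it into a table.
    // The URL is a made-up example, not a real endpoint.
    let
        Source = Csv.Document(Web.Contents("https://example.com/export/report.csv")),
        // Use the first row as the column headers.
        Promoted = Table.PromoteHeaders(Source, [PromoteAllScalars = true])
    in
        Promoted

The resulting table can be loaded to a worksheet and used as the pivot table's source, replacing the manual CSV export/import step.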
When developing a query in Power BI with a database data source, making any changes causes the query editor to 'start from scratch' and re-query the database.
I'm wondering if there is a workaround that avoids these repeated long waits, e.g. by downloading a temporary local flat file of the full dataset, developing the query offline against that file, and then swapping back to the live database connection once you are happy with it.
Importing the data once, exporting it as a CSV from a Power BI table visualisation, and re-importing it as a new data source would work, but maybe there's a simpler way?
Thanks
There are two approaches you can use:
1. If your database supports query folding, make the first step take just the top 200 records whilst you develop your query. Once you're happy with it, remove the FirstN filter (see the sketch below).
2. Load the entire table to the model, export it to a CSV using DAX Studio, develop your query against the CSV, and then switch back to the DB once you're happy with it.
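For approach 1, a minimal M sketch (server, database, table, and column names are all placeholders); because Table.FirstN folds, only 200 rows come over the wire while you develop:

    // Sketch only: all names here are placeholders for your own source.
    let
        Source = Sql.Database("myserver", "mydb"),
        Sales = Source{[Schema = "dbo", Item = "Sales"]}[Data],
        // Development-only step: folds to a TOP 200 clause on the server.
        // Delete this step once the query is finished.
        Top200 = Table.FirstN(Sales, 200),
        // ...develop the rest of your transformations against this preview...
        Result = Table.SelectRows(Top200, each [Amount] > 0)
    in
        Result

Approach 2 works the same way: point the Source step at something like Csv.Document(File.Contents("C:\temp\Sales.csv")) while developing, then swap it back to Sql.Database once you're done.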
I have an existing Power BI report that imports data from a SQL Server Analysis Services database. This works fine, and I can schedule automatic refreshes using the gateway provided by my organization.
I would now like to add some additional, but rarely changing, data that I only have in a local Excel file. When I add this data, the report stops refreshing automatically and complains that it has no gateway to refresh the Excel file.
What I would like is for Power BI to refresh the data from the SQL Server Analysis Services database but keep the existing Excel data without updating it. I will upload an updated version of the Power BI report whenever I need to change the data in the Excel file.
Is that possible? I couldn't find out how. I tried uploading the Excel file as a separate dataset to the Power BI service and referencing that dataset in my report, only to find out that I cannot access another Power BI dataset and a SQL Server Analysis Services database from the same report.
Three things I can think of:
1. Upload the file to OneDrive/SharePoint so that it's accessible online (per Dev's answer).
2. If the data is simple enough, you can add it directly in Power BI itself (via Enter Data) and skip the Excel file entirely; see the sketch after this list.
3. You can disable the Excel file's refresh so that PBI does not try to refresh (and thus access) the local Excel file. (Not sure if this will work.)
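For option 2, small static tables can also be written straight into M with #table, which keeps the data inside the PBIX with nothing external to refresh. A sketch with made-up columns and rows:

    // Sketch only: a small static lookup table embedded in the model itself.
    // Column names and rows are made-up examples.
    let
        Lookup = #table(
            {"Code", "Description"},
            {
                {"A", "Alpha"},
                {"B", "Beta"}
            }
        )
    in
        Lookup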
I came across a similar issue. Yes, you can just use Enter Data to add a table, but you can only build something with fewer than 3,000 cells, so you'd have to merge several tables if your data were larger than that.
Turning off the report refresh as in suggestion #3 above still requires a gateway, unfortunately.
I just created a dataflow and plopped the data from my CSV there. You'll have to create a connection and refresh it, but you don't need to schedule a refresh there, so there's no need to create a gateway.
Then just link the dataflow as a source in your .pbix file and set up your gateway to point at the dataflow.
I have a problem:
I have a PBI file containing three data sources: 2 SQL Server sources + 1 API call.
I have separate queries for each respective data source and an additional query that combines all three queries into a single table.
Both SQL Server sources have been added to a gateway, and I can set scheduled refresh for each source if I publish them in separate PBI files.
However, I cannot set scheduled refresh for the file that contains all three sources - both the data source credentials and scheduled refresh options are greyed out.
The Manage Gateways section of the settings page also shows no gateway options. If I publish the SQL Server data alone (without the API data), I can clearly see my data source and gateway under the gateway heading.
[Screenshot of dataset settings]
Does anyone have any idea why this might be happening?
Thank you,
I had the same problem.
I have a PBI file with different data sources: SQL Server sources and APIs.
On the Power BI Service, the Data Source Credentials section was greyed out, so here's what I did:
Downloaded the file
Refreshed the file locally and signed in to all the data sources (the DB server name had changed, but not the APIs)
Published it back to the PBI Service
It worked for me.
Same problem here. After some additional poking around, I learned that the web source (the API call) was the reason the refresh was unavailable; it can cause "Data Source Credentials" to be inaccessible. This was annoying to learn after diving down several rabbit holes.
Several (weak) workarounds:
Use Excel's Power Query to connect to the web source (see the sketch after these steps).
Make any needed transformations.
Put the Excel file in a SharePoint Online folder or another directory Power BI can access.
Connect to the Excel file using the appropriate data source (e.g. SharePoint folder).
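A minimal M sketch of the first step, assuming the web source returns a JSON array of records (the URL is a placeholder for your actual API call):

    // Sketch only: assumes a JSON endpoint that returns a list of records.
    // Replace the placeholder URL with your real API call.
    let
        Source = Json.Document(Web.Contents("https://example.com/api/data")),
        // Turn the list of records into a table Excel can load to a sheet.
        AsTable = Table.FromRecords(Source)
    in
        AsTable

From there, apply your transformations (step 2) and save the workbook where Power BI can reach it (steps 3-4).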
Alternatively, if the data is static, you can directly copy/paste values into PBI (if you just need to get this done and move on with your life):
Copy target values
Open Power Query Editor
Home tab -> Enter Data
Paste values into table
Hopefully this will save some poor soul a little of their life.
Problem Context
We have a set of Excel reports that are generated from an Excel input provided by the user and then fed into SAS for further transformation. SAS pulls data from a Teradata database, and a lot of manipulation then happens with the input data and the data pulled from Teradata. Finally, a dataset is generated that can either be sent to the client as a report or be used to populate a Tableau dashboard. The database is also being migrated from Teradata to Google Cloud (BigQuery EDW), as the Teradata pulls from SAS used to take almost 6-7 hours.
Problem Statement
Now we need to automate this whole process by creating a front end where the user uploads the input files; from there the process should trigger, and at the end the user should receive the Excel file or Tableau dashboard as an attachment in an email.
Can you suggest which technologies should be used in the front end and middle tier to make this process feasible in the least possible time, with Google Cloud Platform as the backend?
Can an R Shiny front end be a solution, given that we need to communicate with a Google Cloud backend?
I have received suggestions that Django would be a good framework to accomplish this task. What are your views on this?
I have to migrate some legacy data from a stand-alone SQL Server database to a SharePoint list.
I'm going to take a programmatic approach and write code that communicates with the SharePoint list ASMX web service.
Are there any "data transformation wizards" to simplify such a task, or a better approach to port legacy data from a SQL Server database to a SharePoint list?
Thank you in advance!
Since this is a one-time operation, I would not worry about best practice but would consider the fastest way to do it.
You can use Excel 2010 (I have not tested it with Excel 2007) to export data to SharePoint 2010. Here are the high-level steps:
Import data from SQL Server using the Data tab on the ribbon.
Excel will automatically create a table.
Now you can prepare the data for export to SharePoint. Here you can remove unwanted columns, add new columns, remove unwanted rows, rearrange columns, etc.
While in the table, use the "Export Table to SharePoint List" functionality to publish your data to SharePoint. More information is available at: http://office.microsoft.com/en-gb/excel-help/export-an-excel-table-to-a-sharepoint-list-HA010131472.aspx
It is quick! But let's be aware of the limitations:
1. It cannot publish data to a list that already exists.
2. It will not create a content type for the exported list; the columns are attached directly to the list.
If you want greater control over the migration, programming may be the way to go, unless someone in this great forum has a better idea!