SharePoint 2010: best practice to migrate legacy data to a SharePoint list

I have to migrate some legacy data from a stand-alone SQL Server database to a SharePoint list.
I'm going to use a programmatic approach and write code that communicates with the SharePoint Lists ASMX web service.
Are there any "data transformation wizards" to simplify such a task, or a better approach to port legacy data from a SQL Server database to a SharePoint list?
Thank you in advance!

Since this is a one-time operation, I would not worry about best practice but would consider the fastest way to do it.
You can use Excel 2010 (I have not tested it with Excel 2007) to export data to SharePoint 2010. Here are the high-level steps:
1. Import the data from SQL Server using the Data tab in the ribbon.
2. Excel will automatically create a table.
3. Now you can prepare the data for export to SharePoint: remove unwanted columns, add new columns, remove unwanted rows, rearrange columns, etc.
4. While in the table, use the "Export Table to SharePoint List" functionality to publish your data to SharePoint. More information about this is available at: http://office.microsoft.com/en-gb/excel-help/export-an-excel-table-to-a-sharepoint-list-HA010131472.aspx
It is quick! But be aware of the limitations:
1. It cannot publish data to a list that already exists.
2. It will not create a content type for the exported list. The columns are directly attached to the list.
If you want greater control over the migration, programming may be the way to go unless someone has a better idea in this great forum!
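If the programmatic route wins out, the core of it is a CAML Batch posted to the Lists.asmx UpdateListItems method. Below is a minimal sketch in C#, assuming a classic SOAP web reference to /_vti_bin/Lists.asmx has been added under a made-up SharePointLists namespace; the connection string, query, list name, and column names are placeholders for your own legacy schema.

```csharp
// Sketch only: copy rows from a legacy SQL Server table into a SharePoint list
// via the Lists.asmx web service. "SharePointLists" is an assumed proxy namespace.
using System;
using System.Data.SqlClient;
using System.Text;
using System.Xml;

class LegacyListMigration
{
    static void Main()
    {
        var methods = new StringBuilder();
        int id = 1;

        // Placeholder connection string and query for the legacy database.
        using (var conn = new SqlConnection("Server=LEGACY;Database=LegacyDb;Integrated Security=true"))
        using (var cmd = new SqlCommand("SELECT Title, Amount FROM dbo.LegacyTable", conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    // Each <Method Cmd='New'> becomes one new list item.
                    methods.AppendFormat(
                        "<Method ID='{0}' Cmd='New'>" +
                          "<Field Name='Title'>{1}</Field>" +
                          "<Field Name='Amount'>{2}</Field>" +
                        "</Method>",
                        id++,
                        System.Security.SecurityElement.Escape(reader.GetString(0)),
                        reader.GetDecimal(1));
                }
            }
        }

        // Wrap the methods in a CAML Batch and send it to the list.
        var updates = new XmlDocument();
        updates.LoadXml("<Batch OnError='Continue'>" + methods + "</Batch>");

        var lists = new SharePointLists.Lists
        {
            Url = "http://sharepoint/sites/demo/_vti_bin/Lists.asmx",
            UseDefaultCredentials = true
        };
        XmlNode result = lists.UpdateListItems("Legacy Data", updates);
        Console.WriteLine(result.OuterXml); // contains an ErrorCode per Method
    }
}
```

For larger tables it is usually worth chunking the batch into a few hundred Method elements per call and checking the ErrorCode returned for each one.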

Related

Power BI Embedded Approach for 100s of SQL Targets

I'm trying to find the best approach to delivering a BI solution to 400+ customers, each of which has its own database.
I've got Power BI Embedded working using service principal licensing, and I have the Power BI service connected to my data through the on-premises data gateway.
I've built my first report pointing to one of the customer databases, which works lovely.
What I want to do next, when embedding the report, is to tell Power BI to pull the data from a different database for that session.
I'm struggling to find anywhere this is explained, or to understand whether this is even possible.
I'm trying to avoid creating 400+ workspaces or 400+ datasets.
If someone could point me in the right direction, it would be appreciated.
You can configure the report to use parameters, and these parameters can be used to configure the source for your dataset:
https://www.phdata.io/blog/how-to-parameterize-data-sources-power-bi/
These parameters can be set by the app hosting the embedded report:
https://learn.microsoft.com/en-us/rest/api/power-bi/datasets/update-parameters-in-group
Because the app is setting the parameter, each user will only see their own data. Since this will be a live connection, you would need to think about how the underlying server can support the workload.
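To make that concrete, here is a rough sketch of the hosting app calling the Update Parameters In Group endpoint before it embeds the report. The workspace and dataset IDs, the parameter names (ServerName/DatabaseName), and the token acquisition are all placeholders; your dataset has to expose matching parameters for this to work.

```csharp
// Sketch only: point a parameterized dataset at a specific customer database
// before embedding. IDs, parameter names and the AAD token are placeholders.
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class DatasetParameterUpdater
{
    static async Task Main()
    {
        string accessToken = "<AAD token for the service principal>"; // e.g. acquired via MSAL
        string groupId     = "<workspace id>";
        string datasetId   = "<dataset id>";

        // Assumes the dataset defines "ServerName" and "DatabaseName" parameters.
        string body = @"{
          ""updateDetails"": [
            { ""name"": ""ServerName"",   ""newValue"": ""sql01.contoso.local"" },
            { ""name"": ""DatabaseName"", ""newValue"": ""Customer042"" }
          ]
        }";

        using var client = new HttpClient();
        client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", accessToken);

        string url = $"https://api.powerbi.com/v1.0/myorg/groups/{groupId}" +
                     $"/datasets/{datasetId}/Default.UpdateParameters";
        var response = await client.PostAsync(url, new StringContent(body, Encoding.UTF8, "application/json"));
        response.EnsureSuccessStatusCode();
        // Import-mode datasets normally need a refresh afterwards
        // (POST .../datasets/{datasetId}/refreshes) before the new values take effect.
    }
}
```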
An alternative solution would be to consolidate the customer databases into a single database (just the relevant tables) and use row-level security to restrict access for each customer. The advantage of this design is that you take the burden off the underlying SQL instance and push it into a Power BI dataset that is built to handle huge datasets with sub-second response times.
More on that here: https://learn.microsoft.com/en-us/power-bi/enterprise/service-admin-rls
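For the RLS route, the hosting app passes an effective identity when it generates the embed token, and the roles then filter the consolidated dataset down to one customer. Here is a minimal sketch against the GenerateToken REST endpoint; the role name CustomerFilter, the IDs, and the token handling are assumptions.

```csharp
// Sketch only: generate an embed token with an effective identity so RLS rules
// (e.g. a role filtering on USERPRINCIPALNAME()) limit the data to one customer.
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class EmbedTokenGenerator
{
    static async Task<string> GetEmbedTokenJsonAsync(
        string aadToken, string groupId, string reportId, string datasetId, string customerKey)
    {
        // "CustomerFilter" is a hypothetical RLS role defined in the dataset.
        string body = $@"{{
          ""accessLevel"": ""View"",
          ""identities"": [
            {{ ""username"": ""{customerKey}"",
               ""roles"":    [""CustomerFilter""],
               ""datasets"": [""{datasetId}""] }}
          ]
        }}";

        using var client = new HttpClient();
        client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", aadToken);

        string url = $"https://api.powerbi.com/v1.0/myorg/groups/{groupId}/reports/{reportId}/GenerateToken";
        var response = await client.PostAsync(url, new StringContent(body, Encoding.UTF8, "application/json"));
        response.EnsureSuccessStatusCode();
        return await response.Content.ReadAsStringAsync(); // JSON with the embed token
    }
}
```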

PowerBI on demand report/dataset generation

Perhaps better to rephrase my question.
If I have a Power BI report...
Can I somehow make copies of this report with different data(set?) via an API, on demand?
e.g.
User requests a report with some parameters,
the existing report gets copied with a new dataset and published.
Thanks,
Brian
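For reference, one way the "make a copy pointing at a new dataset" step could look is the Clone endpoint of the Power BI REST API, which accepts a targetModelId for the dataset the copy should use. A rough sketch with placeholder IDs and token (the new dataset must already exist in the service):

```csharp
// Sketch only: clone an existing report and bind the copy to a different dataset.
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class ReportCloner
{
    static async Task Main()
    {
        string aadToken         = "<AAD access token>";
        string groupId          = "<workspace id>";
        string templateReportId = "<id of the pre-canned report>";
        string newDatasetId     = "<id of the dataset generated for this request>";

        string body = $@"{{ ""name"": ""Report for request 123"", ""targetModelId"": ""{newDatasetId}"" }}";

        using var client = new HttpClient();
        client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", aadToken);

        string url = $"https://api.powerbi.com/v1.0/myorg/groups/{groupId}/reports/{templateReportId}/Clone";
        var response = await client.PostAsync(url, new StringContent(body, Encoding.UTF8, "application/json"));
        response.EnsureSuccessStatusCode();
        Console.WriteLine(await response.Content.ReadAsStringAsync()); // metadata of the new report copy
    }
}
```

There is also a Rebind endpoint (POST .../reports/{reportId}/Rebind) if you would rather repoint an existing copy at a new dataset instead of cloning each time.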
== original question below ==
Would anyone be able to point me in the right direction here?
I currently have a C# application that generates reports which are simply Excel files.
Each time I create a report I simply open an Excel file with some pre-existing formatting, set the data in a particular sheet, then save it to a new location.
How can I achieve something similar with Power BI?
Ideally I'd like to open a pre-canned Power BI report and pass it a parameter pointing to the 'dataset' I've just generated.
I don't have any strong feeling as to what the dataset would be; I'm open to whatever works:
a database, a static dataset created by C# somehow, a web service, etc.
Can anyone suggest how? Thanks.
How about creating your Power BI report based on your existing Excel files? Make sure to store them on SharePoint so that the Power BI online service can pick them up without additional gateways. You can then use your app to trigger the refresh as soon as the Excel files have been updated.
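If you go that route, the app-side trigger could be the Refresh Dataset In Group REST call once the Excel file on SharePoint has been rewritten; a minimal sketch with placeholder IDs and token:

```csharp
// Sketch only: queue a refresh of the dataset that sits on top of the Excel file.
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class DatasetRefresher
{
    static async Task TriggerRefreshAsync(string aadToken, string groupId, string datasetId)
    {
        using var client = new HttpClient();
        client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", aadToken);

        string url = $"https://api.powerbi.com/v1.0/myorg/groups/{groupId}/datasets/{datasetId}/refreshes";
        var response = await client.PostAsync(url,
            new StringContent(@"{ ""notifyOption"": ""NoNotification"" }", Encoding.UTF8, "application/json"));
        response.EnsureSuccessStatusCode(); // 202 Accepted means the refresh was queued
    }
}
```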

Power Automate: SharePoint List Export to SQL Server

I need to do a one-time load of all the rows from a SharePoint list into SQL Server. I am having trouble parsing the output of the Get items action, especially for fields that are arrays.
Has anybody tried this before?
Thanks
Barry

Navision DB/Company backup and restore using SQL/.NET

In my job, I have to do a lot of backing up and restoring of NAV companies in order to create new companies similar to an existing one. I am planning to build a .NET application to do the job and basically automate the repetitive stuff, but the problem is that the Navision version we use is 2009 R2, and I can't find a way to back up and restore a NAV database/company in 2009 R2 using .NET/SQL. Is there any way to do this?
As said, there is no way to automate it using a script. When performing a backup/restore, NAV does many things aside from just creating another set of tables: it creates keys/views and appends records to system tables like Company (where the list of companies is stored).
From your question I can't understand why you need to back up a company in order to create a similar one, because after that you would have to clear all the ledgers etc. Why copy data just to wipe it afterwards?
An alternative approach you could use to solve the problem of creating a new company quickly is to create a codeunit in NAV that will populate the newly created company with all the data you need. Take a look at codeunit 2, Company-Initialize. When run, it creates empty records in all the important setup tables and fills the report selections. You can modify it, or create a similar one that fills the setup tables with your default values or copies them from another company you pass as a parameter (use CHANGECOMPANY for that).
Here is one more thing that I've found:
In earlier versions of Microsoft Dynamics NAV, you could create a company by using the INSERT Function (Record) to add a record to table 2000000006, the Company table. In Microsoft Dynamics NAV 2013, it is not supported to create a company by using the INSERT function. You must create companies by using the New Company window in the development environment.
That means that in your version you can even create a new company automatically from the codeunit I've mentioned.
Also, since NAV 2013 R2 there are new capabilities: you can use the command-line parameters of finsql.exe to create a company, and then invoke a NAV codeunit from a PowerShell script to populate it with data.
There is no way to back up a NAV company using SQL; you can only back up the whole database.
If you want to back up a separate company you need to use the built-in backup with .fbk files (Tools -> Backup).
From NAV 2015 you can back up/restore companies from the RoleTailored/Windows client.
Cheers!

Anyone using a web service as a data source in Excel 2007?

Can I use a web service as a data source for creating Excel pivot tables?
Currently, the source data for the pivot table is exported from our SQL DB to a CSV file. Then, the CSV file is loaded into a worksheet, and from there a pivot table is created in the same workbook.
Customers log in to a website, click some links, and an Excel file (with data and a pivot table) is generated. This is a public app, so the preference is not to connect directly to the DB.
We control the database and generate the output. We are looking to streamline this process. The SQL DB and pivot tables cannot/will not change.
See http://www.vertex42.com/News/excel-web-query.html
What format does the "public-facing website" use in making the data available? A data file, a table on a web page? This issue will determine how much of a scraping operation you'll need to do.
You'll still need to write the web service and have it run on a server. A possible alternative is to use Yahoo Pipes to do the conversions for you.
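If the site can expose the data as an HTML table (or another page Excel can parse), the Excel-side hook described in that article is a web query. A minimal .iqy file, with a made-up URL and parameter prompt, looks like this:

```
WEB
1
http://reports.example.com/pivotdata.aspx?customer=["CustomerId","Enter the customer id"]
```

Excel prompts for the parameter, pulls the table into a worksheet, and the pivot table can then be built on that range, which keeps the database itself off the public path.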