Navision DB/Company backup and restore using SQL/.NET - microsoft-dynamics

In my job, I have to do a lot of backing up and restoring of NAV companies in order to create new companies similar to a previous one. I am planning to build a .NET application to do the job and automate the repetitive work, but the problem is that the Navision version we use is 2009 R2, and I can't find a way to back up and restore a NAV database/company in 2009 R2 using .NET/SQL. Is there any way to do this?

As said, there is no way to automate this with a script. When performing a backup/restore, NAV does many things besides just creating another set of tables: it creates keys and views and appends records to system tables such as Company (where the list of companies is stored).
From your question I can't see why you need to back up a company in order to create a similar one, because afterwards you would have to clear all the ledgers etc. Why copy data just to wipe it again?
An alternative approach to creating a new company quickly is to create a codeunit in NAV that populates the newly created company with all the data you need. Take a look at codeunit 2 Company-Initialize: when run, it creates empty records in all the important setup tables and fills the report selections. You can modify it, or create a similar codeunit that fills the setup tables with your default values or copies them from another company passed as a parameter (use CHANGECOMPANY for that).
Here is one more thing that I've found:
In earlier versions of Microsoft Dynamics NAV, you could create a company by using the INSERT Function (Record) to add a record to table 2000000006, the Company table. In Microsoft Dynamics NAV 2013, it is not supported to create a company by using the INSERT function. You must create companies by using the New Company window in the development environment.
That means that in your version you can even create the new company automatically from the codeunit I've mentioned.
Also, since NAV 2013 R2 there are new capabilities: you can create a company using command-line parameters of finsql.exe (or the New-NAVCompany PowerShell cmdlet) and then invoke a NAV codeunit from a PowerShell script to populate it with data.
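This only applies from NAV 2013 R2 onwards, not to the 2009 R2 version in the question, but as a rough sketch of that kind of automation, driven from Python purely for illustration (the module path, server instance, company name and codeunit ID are placeholders; New-NAVCompany and Invoke-NAVCodeunit come from the NAV administration module):

```python
import subprocess

# Placeholders - adjust to your environment (NAV 2013 R2+ administration module).
SERVER_INSTANCE = "DynamicsNAV71"
NEW_COMPANY = "New Company Ltd."
INIT_CODEUNIT_ID = 50000  # your custom "populate company" codeunit

# Build a PowerShell script that imports the NAV management module,
# creates the company and then runs the codeunit that fills the setup tables.
ps_script = f"""
Import-Module 'C:\\Program Files\\Microsoft Dynamics NAV\\71\\Service\\Microsoft.Dynamics.Nav.Management.dll'
New-NAVCompany -ServerInstance '{SERVER_INSTANCE}' -CompanyName '{NEW_COMPANY}'
Invoke-NAVCodeunit -ServerInstance '{SERVER_INSTANCE}' -CompanyName '{NEW_COMPANY}' -CodeunitId {INIT_CODEUNIT_ID}
"""

subprocess.run(["powershell.exe", "-NoProfile", "-Command", ps_script], check=True)
```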

There is no way to back up a single NAV company using SQL; you can only back up the whole database.
If you want to back up a separate company, you need to use the built-in backup to .fbk files (Tools -> Backup).
From NAV 2015 onwards you can back up and restore companies from the RoleTailored/Windows client.
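So if you want to automate something from .NET/SQL, the realistic option is a full database backup at the SQL level. A minimal sketch using Python and pyodbc (the server, database name and backup path below are placeholders):

```python
import pyodbc

# Placeholders - adjust server, database and backup path to your environment.
conn = pyodbc.connect(
    "DRIVER={SQL Server};SERVER=MYSQLSERVER;DATABASE=master;Trusted_Connection=yes;",
    autocommit=True,  # BACKUP DATABASE cannot run inside a user transaction
)

backup_sql = (
    "BACKUP DATABASE [Demo Database NAV (6-0)] "
    "TO DISK = N'D:\\Backups\\NavDb.bak' WITH INIT, STATS = 10"
)
cursor = conn.cursor()
cursor.execute(backup_sql)
# Consume the informational result sets so the backup completes before closing.
while cursor.nextset():
    pass
conn.close()
```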
Cheers!

Related

Power BI Embedded Approach for 100s of SQL Targets

I'm trying to find the best approach to delivering a BI solution to 400+ customers which each have their own database.
I've got PowerBI Embedded working using service principal licensing and I have the PowerBI service connected to my data through the On Premise Data Gateway.
I've built my first report pointing to one of the customer databases, which works lovely.
What I want to do next, when embedding the report, is to tell Power BI, for this session, to get the data from a different database.
I'm struggling to find anywhere this is explained, or to understand whether it is even possible.
I'm trying to avoid creating 400+ workspaces or 400+ datasets.
If someone could point me in the right direction, it would be appreciated.
You can configure the report to use parameters and these parameters can be used to configure the source for your dataset:
https://www.phdata.io/blog/how-to-parameterize-data-sources-power-bi/
These parameters can be set by the app hosting the embedded report:
https://learn.microsoft.com/en-us/rest/api/power-bi/datasets/update-parameters-in-group
Because the app is setting the parameter, each user will only see their own data. Since this will be a live connection, you would need to think about how the underlying server can support the workload.
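As a rough sketch of that second step, here is how the Update Parameters call could look from Python using the Power BI REST API (the workspace ID, dataset ID, parameter name and token handling are assumptions you would replace with your own):

```python
import requests

# Placeholders - supply your own values.
access_token = "<AAD access token for the service principal>"
group_id = "<workspace id>"
dataset_id = "<dataset id>"

url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}"
    f"/datasets/{dataset_id}/Default.UpdateParameters"
)
body = {
    "updateDetails": [
        # The parameter name must match the parameter defined in the report.
        {"name": "DatabaseName", "newValue": "Customer042Db"},
    ]
}
resp = requests.post(url, json=body, headers={"Authorization": f"Bearer {access_token}"})
resp.raise_for_status()
```

If the dataset imports its data rather than using DirectQuery, trigger a dataset refresh afterwards so the new parameter values take effect.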
An alternative solution would be to consolidate the customer databases into a single database (just the relevant tables) and use row level security to restrict access for each customer. The advantage to this design is that you take the burden off of the underlying SQL instance and push it into a PBI dataset that is made to handle huge datasets with sub-second response times.
More on that here: https://learn.microsoft.com/en-us/power-bi/enterprise/service-admin-rls

Reuse a previously published datasource in a Power BI report

I have developed a Power BI report using Power BI Desktop, pointing to a private on-premise development database as the data source so that I was able to develop and test it easily. Then I published it from my Power BI Desktop .pbix to my customer's workspace.
As a result, the workspace contains the published report and the dataset. Later, my customer changed the dataset so that it now points to their own on-premise production database. It works perfectly.
Now, I want to publish a new report for my customer using the previously published and reconfigured dataset. The problem is that I can't see any option in Power BI Desktop to have the report point to the published dataset, nor any option to avoid creating a new dataset each time I publish a report, nor any way to reconfigure the newly published report from the web portal to point to the same dataset as the first one.
Is there any way to do this, or any workaround for this scenario? I think the most reasonable solution would be to be able to change the dataset of any report, so that reports and datasets would be interchangeable.
Update:
I had already used connection-specific parameters, but I'm not given rights to change the published dataset, so that's a dead end.
Another thing I have found is that in Power BI Desktop you cannot change the connection parameter values to those of the production environment and publish the report if you can't reach the target database from your computer: Power BI Desktop asks you to apply the changes first, and when it applies them it tries to connect to the corresponding database, which obviously ends with a network or timeout error, cancelling the changes and returning you to the starting point.
It's always a good practice to use connection-specific parameters to define the data source. This means that you do not enter the server name directly, but specify it indirectly through a parameter; the same goes for the database name, if applicable.
If you are about to make a new report, cancel the Get Data dialog, define the parameters as described below, and then specify the data source in Get Data using these parameters.
To modify an existing report, open the Power Query Editor by clicking Edit Queries and, in Manage Parameters, define two new text parameters; let's name them ServerName and DatabaseName.
Set their current values to point to one of your data sources, e.g. SQLSERVER2016 and AdventureWorks2016. Then right-click your query in the report and open Advanced Editor. Find the server name and database name in the M code (for a SQL Server source this is the Sql.Database("SQLSERVER2016", "AdventureWorks2016") call) and replace them with the parameters defined above, so that the call becomes Sql.Database(ServerName, DatabaseName).
Now you can close and apply the changes, and your report should work as before. When you want to change the data source, use Edit Parameters and change the server and/or database name to point to the other data source you want to use for your report.
After changing the parameter values, Power BI Desktop will ask you to apply the changes and reload the data from the new data source. To change the parameter values (i.e. the data source) of a report published in the Power BI Service, go to the dataset's settings and enter the new server and/or database name.
If the server is on-premise, also check the Gateway connection to make sure it is configured to use the right gateway. You may also want to check the available gateways in Manage gateways.
After changing the data source, refresh your dataset to get the data from the new data source. With a Power BI Pro account you can do this 8 times per 24 hours, while if the dataset is in a dedicated capacity this limit is raised to 48 times per 24 hours.
This is an easy way to make your reports "switchable", e.g. for switching a report from a DEV or QA to a PROD environment, or, as part of your disaster recovery plan, to automate switching all reports in a workspace to another DR server. In your case, this will allow you (or your customers) to easily switch the data source of the report.
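As a hedged illustration of that kind of automation, here is a Python sketch that walks every dataset in a workspace, points the ServerName/DatabaseName parameters at a DR server and triggers a refresh via the Power BI REST API (the token, workspace ID and parameter names are assumptions that must match your own reports, and datasets without those parameters will return an error):

```python
import requests

API = "https://api.powerbi.com/v1.0/myorg"
headers = {"Authorization": "Bearer <AAD access token>"}  # placeholder token
group_id = "<workspace id>"                               # placeholder workspace

new_values = [  # parameter names as defined in each report
    {"name": "ServerName", "newValue": "DRSQLSERVER"},
    {"name": "DatabaseName", "newValue": "AdventureWorks2016"},
]

datasets = requests.get(f"{API}/groups/{group_id}/datasets", headers=headers).json()["value"]
for ds in datasets:
    ds_id = ds["id"]
    # Repoint the data source parameters...
    requests.post(
        f"{API}/groups/{group_id}/datasets/{ds_id}/Default.UpdateParameters",
        json={"updateDetails": new_values},
        headers=headers,
    ).raise_for_status()
    # ...and reload the data from the new server.
    requests.post(f"{API}/groups/{group_id}/datasets/{ds_id}/refreshes", headers=headers).raise_for_status()
```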
I think the only correct answer is that it cannot be done, at least at this moment.
The closest way of achieving this is with live connections:
https://learn.microsoft.com/en-us/power-bi/desktop-report-lifecycle-datasets
But if you have already designed your report without using a live connection, against your own development environment and its connection parameters, then you are stuck: your only options are to redo the whole report with a live connection or, as an odd workaround, to use an alias in your configuration that matches the name of the production database server, together with the same database name as in the target production environment.

NAV 2009 PO Integration

I have no experience with NAV.
I have to move purchase order data from another system to NAV 2009. I have worked on the ETL part and have all the data that needs to be moved to NAV ready. How do I import it into NAV?
This depends, of course, on where your data is stored and whether your license allows you to write code / create new objects in NAV. But in any case, there are at least two tables that must be filled: "Purchase Header" and "Purchase Line". Some other tables (like Document Dimension) might also be required, depending on the data you need to transfer. It is not recommended to insert records directly into the corresponding SQL Server tables, since there is a lot of C/AL code in the NAV order tables that validates the data, so the C/AL triggers must be executed.
This still leaves several options.
Write C/AL code that reads the data from the external source and inserts records into Purchase Header and Purchase Line. This assumes that you have the appropriate license permissions and some experience with the NAV development environment.
Create an XMLport object if the data can be stored in XML format, or a Dataport for CSV-like files (this object type is still available in 2009). Both can be restricted by the license too. See: About NAV XMLport objects and NAV Dataport objects.
Publish the purchase order pages as web services and insert the records from a consuming application (a rough sketch follows below). See: Registering and consuming a web service in NAV 2009.
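As a very rough sketch of that third option, here is how a published Purchase Order page service could be consumed from Python with zeep and NTLM authentication. The server name, company, page/service name and field names below are all assumptions; take the real element names from the WSDL that NAV generates for your published page:

```python
import requests
from requests_ntlm import HttpNtlmAuth
from zeep import Client
from zeep.transports import Transport

# Placeholders - NAV 2009 web services listen on port 7047 by default.
wsdl = ("http://navserver:7047/DynamicsNAV/WS/"
        "CRONUS%20International%20Ltd/Page/PurchaseOrder")

session = requests.Session()
session.auth = HttpNtlmAuth("DOMAIN\\navuser", "password")
client = Client(wsdl=wsdl, transport=Transport(session=session))

# Element and field names depend on how the page is published - check the WSDL.
po_type = client.get_type("ns0:PurchaseOrder")
po = po_type(Buy_from_Vendor_No="10000")
created = client.service.Create(PurchaseOrder=po)  # Create runs the C/AL insert triggers
print(created.No)  # NAV assigns the document number from the number series
```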

Inserting records in CRM 2011 using SSIS

I am working on an SSIS (2012) package that moves data from our till system to a staging area, and from the staging area to CRM 2011 (on-premise, Rollup 11).
In CRM we have a contact entity and an order entity. These two entities are related via a GUID: contactid (the PK on contact) and customerid (the FK on order).
When I insert a new order into CRM, how do I ensure that the GUID is set so that the order is associated with either a new contact or an already existing contact?
I'm assuming that since you're using SSIS you're doing straight SQL inserts? If so, this is not supported. Ideally you'd be using the SDK, and in that case you can set the GUID manually before actually creating the record, although the contact id still has to exist when creating the order.
So you'll want to grab all of your existing contacts up front, then determine for each order whether the contact exists or not. If it does, just set the customerId when you create the order and you're all set. If it doesn't, you'll need to create the contact (potentially assigning it an id yourself), and then set the customerId when you create the order.
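In pseudocode terms, the flow looks like this (Python purely for illustration; crm_retrieve_contacts, crm_create_contact and crm_create_order are hypothetical helpers standing in for whatever SDK or integration component you end up calling):

```python
import uuid

def load_orders_into_crm(staged_orders, crm_retrieve_contacts, crm_create_contact, crm_create_order):
    # 1. Grab all existing contacts up front, keyed by whatever you match on
    #    (e.g. a customer number held in both the till system and CRM).
    contacts = crm_retrieve_contacts()  # hypothetical: {customer_number: contactid GUID}

    for order in staged_orders:
        key = order["customer_number"]
        contact_id = contacts.get(key)
        if contact_id is None:
            # 2. Contact does not exist yet: assign a GUID ourselves so we can
            #    reference it immediately, then create the contact before the order.
            contact_id = uuid.uuid4()
            crm_create_contact(key, contact_id)  # hypothetical create call
            contacts[key] = contact_id
        # 3. The contact now exists, so the order can point its customerid at it.
        crm_create_order(order, contact_id)  # hypothetical create call
```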
I would echo what Daryl has said, in that SQL inserts are not supported and generally a bad idea. However, there is a solution: a company called KingswaySoft makes an SSIS component that allows you to read and write to CRM using the web services. The best part is that it is free if you don't need to run it from SQL Server Agent, and even if you do want to schedule it, the cost is very small for such an excellent product.
You can download it from here
http://www.kingswaysoft.com/products/ssis-integration-toolkit-for-microsoft-dynamics-crm

Sharepoint 2010: best practice to migrate legacy data to sharepoint list

I have to migrate some legacy data from a stand-alone SQL Server database to a SharePoint list.
I'm going to use a programmatic approach and write code that communicates with the SharePoint Lists asmx web service.
Are there any "data transformation wizards" to simplify such a task, or a better approach to port legacy data from a SQL Server database to a SharePoint list?
Thank you in advance!
This being a one-time operation, I would not worry about best practice but would consider the fastest way to do it.
You can use Excel 2010 (I have not tested it with Excel 2007) to export data to SharePoint 2010. Here are the high-level steps:
Import the data from SQL Server using the Data tab in the ribbon.
Excel will automatically create a table.
Now you can prepare the data for export to SharePoint. Here you can remove unwanted columns, add new columns, remove unwanted rows, rearrange columns, etc.
While in the table, use the "Export Table to SharePoint List" functionality to publish your data to SharePoint. More information about this is available at: http://office.microsoft.com/en-gb/excel-help/export-an-excel-table-to-a-sharepoint-list-HA010131472.aspx
It is quick! But let's be aware of the limitations:
1. It cannot publish data to a list that already exists.
2. It will not create a content type for the exported list; the columns are attached directly to the list.
If you want greater control over the migration, programming may be the way to go unless someone has a better idea in this great forum!
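If you do go the programmatic route against the Lists web service, here is a rough sketch of an insert batch via UpdateListItems, using Python with NTLM authentication purely for illustration. The site URL, list name and field names are placeholders, and the Field elements must use the SharePoint internal column names:

```python
import requests
from requests_ntlm import HttpNtlmAuth

SITE = "http://sharepoint/sites/legacy"   # placeholder site URL
LIST_NAME = "LegacyCustomers"             # placeholder list name

# One <Method Cmd="New"> per row to insert; Field Name is the internal column name.
batch = """
<Batch OnError="Continue">
  <Method ID="1" Cmd="New">
    <Field Name="Title">Contoso Ltd.</Field>
    <Field Name="CustomerNo">10000</Field>
  </Method>
</Batch>
"""

envelope = f"""<?xml version="1.0" encoding="utf-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <UpdateListItems xmlns="http://schemas.microsoft.com/sharepoint/soap/">
      <listName>{LIST_NAME}</listName>
      <updates>{batch}</updates>
    </UpdateListItems>
  </soap:Body>
</soap:Envelope>"""

resp = requests.post(
    f"{SITE}/_vti_bin/Lists.asmx",
    data=envelope.encode("utf-8"),
    headers={
        "Content-Type": "text/xml; charset=utf-8",
        "SOAPAction": "http://schemas.microsoft.com/sharepoint/soap/UpdateListItems",
    },
    auth=HttpNtlmAuth("DOMAIN\\user", "password"),
)
resp.raise_for_status()  # then inspect the ErrorCode element per Method for row-level failures
```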