How to modify a scope in Sync Framework? - microsoft-sync-framework

I am new to using the Sync Framework and need help fixing an issue.
The system we built is a Windows-based application. Each user has their own database on their local machine. At the end of the day, they sync their database to the remote DB server when they are within the network.
I added two new columns to an existing table. The scope definition seems to be updated in my local database, but when I try to sync with my remote DB server it errors out saying it could not find the bulk-insert stored procedure.
When I checked my remote DB server, I could see the new columns in the table, but I don't see any of the stored procedures, and the scope_config table does not have the new columns in it.
Does the remote server need to have the stored procedures, or will updating the scope_config table do?

Have you provisioned your remote DB server? If you're not finding the Sync Framework-related objects, then it's not provisioned.
Likewise, Sync Framework does not support schema synchronization, and there's nothing in the API to let you alter the scope definition either.
You either drop and re-create the scope and re-sync, or you hack your way into the Sync Framework scope definition metadata.
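If you go down the metadata route, a first step is simply inspecting what (if anything) is provisioned on the remote server. The sketch below assumes the default Sync Framework 2.1 object names; `MyTable` is a placeholder:

```sql
-- Sketch: inspecting Sync Framework metadata on the remote server.
-- If these tables are missing, the database was never provisioned.

-- 1. Check whether the scope exists at all.
SELECT sync_scope_name FROM scope_info;

-- 2. Inspect the scope definition XML (this is what would need "hacking"
--    to reflect the two new columns without re-provisioning).
SELECT config_data FROM scope_config;

-- 3. List the Sync Framework generated stored procedures for a table.
SELECT name FROM sys.procedures WHERE name LIKE 'MyTable[_]%';
```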

Related

How to capture session level details in a table

I have a requirement to capture session-level details like session start time, end time, src success rows, failed rows, etc. in an audit table. Since all these details are available in prebuilt session variables, I need to store them in a table. What I am doing now is taking an Assignment task in a workflow, assigning all these prebuilt session variables' values for a particular session to workflow variables, and passing these workflow variables to mapping variables in another non-reusable session (the mapping that loads the table) using the pre-session variable assignment option. It works fine for a workflow with one session. But if I have to implement this for a workflow with many sessions, the process becomes tedious, as I have to create an Assignment task for each of these sessions and a non-reusable session that calls a mapping to load the audit table.
So I am wondering, is there an alternative solution to get this job done? I am thinking of a solution where we capture the audit details of all sessions in a file and pass this file as an input to a mapping that loads the data into the table all at once. Is this possible? Any solution?
Check this out: ETL Operational framework
It covers an end-to-end solution that should fit your needs and be quite easy to extend if you have multiple sessions - all you'd need to do is apply similar post-session commands before running the final session that loads the stats to the database.

Push from one sql server to another autonomously

I have an application that requires me to pull certain information from DB#1 and push it to DB#2 every time a certain entry in a table from DB#1 is updated. The polling rate doesn't need to be extremely fast, but it probably shouldn't be any slower than 1 second.
I was planning on writing a small service using the C++ Connector library, but I am worried about putting too much load on DB#1. Is there a more efficient way of doing this? Such as built in functionality within an SQL script?
There are many ways to accomplish this, so other factors you care about may drive the approach.
If the SQL Server databases are on the same server instance:
A trigger on the DB1 tables that pushes to the DB2 tables
A stored procedure (in DB1 or DB2) that uses MERGE to identify changes and sync them to DB2, called on your schedule by a SQL Agent job
Enable Change Tracking on the database and desired tables, then use a stored procedure + SQL Agent job to send changes without querying the source tables
If on different instances or servers (can also work if on same instance though):
An SSIS package to identify changes and push them to DB2 (bonus: can work with Change Data Capture)
Merge Replication to synchronize changes
AlwaysOn Availability Groups to synchronize entire databases
Microsoft Sync Framework
Knowing nothing about your preferences or comfort level, I would probably start with Merge Replication - it can be a bit tricky and tedious to set up, but it performs very well.
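The MERGE option above can be sketched as follows. Table, column, and procedure names are placeholders, and both databases are assumed to be on the same instance:

```sql
-- Hypothetical sketch of the MERGE approach: a procedure that pulls
-- changed rows from DB1 into DB2, using a ModifiedDate column to detect
-- updates. Schedule it with a SQL Agent job on a short interval.
CREATE PROCEDURE dbo.SyncWidgetsFromDB1
AS
BEGIN
    SET NOCOUNT ON;

    MERGE DB2.dbo.Widgets AS target
    USING DB1.dbo.Widgets AS source
        ON target.WidgetID = source.WidgetID
    WHEN MATCHED AND target.ModifiedDate < source.ModifiedDate THEN
        UPDATE SET target.Name         = source.Name,
                   target.ModifiedDate = source.ModifiedDate
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (WidgetID, Name, ModifiedDate)
        VALUES (source.WidgetID, source.Name, source.ModifiedDate);
END
```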
You can also create a trigger in DB1 and a linked server between DB1 and DB2, so the trigger fires natively within DB1 and transfers the data directly to DB2.
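A minimal sketch of that trigger approach, with placeholder table and column names; it assumes DB2 is on the same instance (for a different server, reference the table through a linked server name instead):

```sql
-- Hypothetical sketch: run in DB1. New rows in dbo.Widgets are pushed
-- straight into DB2. Note the trigger runs inside the DB1 transaction,
-- so it adds latency to every insert on dbo.Widgets; updates would need
-- a MERGE instead of a plain INSERT.
CREATE TRIGGER dbo.trg_Widgets_PushToDB2
ON dbo.Widgets
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;
    INSERT INTO DB2.dbo.Widgets (WidgetID, Name, ModifiedDate)
    SELECT i.WidgetID, i.Name, i.ModifiedDate
    FROM inserted AS i;
END
```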

Sync Framework download with filter and but upload all

I want to synchronize data between a central SQL Server database and SQL Server LocalDB. Most tables are downloaded and uploaded without a filter. One table should be downloaded using a filter but uploaded without one. Is there any sample code for this situation? Should I provision two different scopes in both the server database and LocalDB?
Just create separate scopes and you should be fine:
1 - a filtered download scope
2 - an unfiltered upload scope

Microsoft Sync Framework and Aggregate Functions

I'm currently trying to implement Microsoft Sync Framework for field agents that will be working mostly disconnected from the server.
Currently I have a SQL Express database the application points to in offline mode, and when they are back online, they can hit a sync button to push the changes up and down.
I have no problems creating the filtered scope, but our schema uses a "VersionID" column to handle historical data.
No data is deleted from the databases, so when a row is "updated", a new row is inserted with max(VersionID) + 1 as its new VersionID.
Since I can't use aggregate functions in a filtered scope, I can't figure out how to retrieve only the max-version row for each unique record.
I need to retrieve only the max(VersionID) record because of the 10 GB database limit; I can't possibly download all records without going over the limit, given all the support tables the application requires.
Any ideas?
The scope filter is simply appended to the _selectchanges stored procedure's WHERE clause. If you can express your condition as a simple query, you should be able to set the same as the scope filter.
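An untested sketch of what such a filter clause might look like: a correlated subquery avoids a bare aggregate in the filter itself. This assumes the generated _selectchanges procedure aliases the base table as [base] (table and column names are placeholders - verify the aliases in your generated procedure before relying on this):

```sql
-- Hypothetical filter clause text, as it would appear appended to the
-- _selectchanges WHERE clause; RowKey stands in for whatever identifies
-- a "unique row" across its versions.
[base].[VersionID] = (SELECT MAX(t2.[VersionID])
                      FROM dbo.MyTable AS t2
                      WHERE t2.[RowKey] = [base].[RowKey])
```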

Making database schema changes using Microsoft Sync framework without losing any tracking table data

I am using Microsoft Sync Service Framework 4.0 for syncing SQL Server database tables with a SQLite database on the iPad side.
Before making any database schema changes in the SQL Server database, we have to deprovision the database tables. Also, after making the schema changes, we re-provision the tables.
In this process, the tracking tables (i.e. the syncing information) get deleted.
I want the tracking table information to be restored after re-provisioning.
How can this be done? Is it possible to make DB changes without deprovisioning?
E.g., the application is in version 2.0 and syncing is working fine. In the next version 3.0, I want to make some DB changes. In the process of deprovisioning and provisioning, the tracking info gets deleted, so all the tracking information from the previous version is lost. I do not want to lose the tracking info. How can I restore this tracking information from the previous version?
I believe we will have to write custom code or a trigger to store the tracking information before deprovisioning. Could anyone suggest a suitable method or provide some useful links regarding this issue?
The provisioning process should automatically populate the tracking table for you; you don't have to copy and reload it yourself.
Now, if you think the tracking table is where the framework stores what was previously synced, the answer is no.
The tracking table simply stores what was inserted/updated/deleted; it's used for change enumeration. The information on what was previously synced is stored in the scope_info table.
When you deprovision, you wipe out this sync metadata. When you sync, it's as if the two replicas have never synced before, so you will encounter conflicts as the framework tries to apply rows that already exist on the destination.
you can find information here on how to "hack" the sync fx created objects to effect some types of schema changes.
Modifying Sync Framework Scope Definition – Part 1 – Introduction
Modifying Sync Framework Scope Definition – Part 2 – Workarounds
Modifying Sync Framework Scope Definition – Part 3 – Workarounds – Adding/Removing Columns
Modifying Sync Framework Scope Definition – Part 4 – Workarounds – Adding a Table to an existing scope
Let's say I have one table, "User", that I want to sync.
A tracking table "User_tracking" will be created, and some sync information will be present in it after syncing.
When I make any DB changes, this tracking table "User_tracking" will be deleted and the tracking info will be lost during the deprovisioning/provisioning process.
My workaround:
Before deprovisioning, I will write a script to copy all the "User_tracking" data into another temporary table, "User_tracking_1", so all the existing tracking info is stored in "User_tracking_1". When I re-provision the table, a new tracking table "User_tracking" will be created.
After re-provisioning, I will copy the data from "User_tracking_1" back to "User_tracking" and then delete the contents of "User_tracking_1".
The User_tracking info will be restored.
Is this the right approach?
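The workaround described above can be sketched as a script (table names follow the example). Note the earlier answer's point that re-provisioning re-populates the tracking table itself, so blindly restoring would duplicate rows; the sketch clears the freshly generated rows first. If the schema change touched the tracking table's columns (e.g. new filter columns), the `SELECT *` restore would need an explicit column list:

```sql
-- 1. Before deprovisioning: snapshot the tracking data.
SELECT * INTO dbo.User_tracking_1 FROM dbo.User_tracking;

-- ... deprovision, apply schema changes, re-provision ...

-- 2. After re-provisioning: replace the freshly generated rows with the
--    snapshot, then drop the temporary table.
DELETE FROM dbo.User_tracking;
INSERT INTO dbo.User_tracking SELECT * FROM dbo.User_tracking_1;
DROP TABLE dbo.User_tracking_1;
```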