SyncFx SQL Knowledge RRS feed

  • Question

  • I'm not sure this is the best place to ask this question... I know there are SyncFx boards, but they are not very active, and whenever I've called, I have always ended up in SQL Support.

    I am looking for some help with an unusual requirement working with the SyncFx library and SQL Server. 

    Short Overview:

    We have a long-established enterprise platform that uses SyncFx between one central application and several satellite applications.  Each application (central or satellite) has a local SQL Server instance in which data is stored.  I should add that this application is sold to customers, so we in fact have many instances of this “one central, many satellites” deployment.

    As is normal with SyncFx, we’ve had to de-provision and re-provision the data any time a release includes significant database changes.  The next sync then takes an exceedingly long time, as each row is sent from the central server to the satellite, and from the satellite back to the central server, to re-initialize the tracking tables and (more to the point of this post) the knowledge stored in the scope_info tables.  This process takes so long that customers have begun balking at the upgrade process.  To accommodate them, we’ve written our own code to minimize the need to fully re-provision the databases.  This is working well, but now comes the need to change the definition of a scope.

    Ultimately, what I would like to do to avoid this in the future is break up all of our current scopes into single-table scopes.  I already have the ability to drive them in a way that reduces dependent conflicts, and to automatically resolve others.

    But I can’t impose another lengthy upgrade to get to single-table scopes.  And I don’t think I should need to.  This gets to the …

    Problem Description:

    To remedy this, I think I should be able to take an existing scope that has, say, 10 tables and create 10 separate scopes, one per table.  I should be able to take the knowledge stored in the original scope’s scope_info row and create knowledge for each of the 10 new tables from it.  I see how to use SyncKnowledge.Deserialize to get a SyncKnowledge instance, and I have written a test to prove it, but there is nothing available that explains how I might derive each table’s separate knowledge from the original scope’s knowledge.  So my request is for some direction on how this should be done.  I do understand that I am going into uncharted waters….
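    For what it's worth, here is roughly what my deserialization test looks like. This is a sketch only: the scope_info column name and the use of the provider's IdFormats are my assumptions about the default SqlSyncProvider provisioning, so check them against your own schema.

    ```csharp
    using System.Data.SqlClient;
    using Microsoft.Synchronization;
    using Microsoft.Synchronization.Data.SqlServer;

    byte[] raw;
    using (var conn = new SqlConnection(connectionString))
    {
        conn.Open();

        // NOTE: the knowledge column name below is a guess; verify it
        // against the scope_info table your provisioning actually created.
        using (var cmd = new SqlCommand(
            "SELECT sync_scope_knowledge FROM scope_info WHERE sync_scope_name = @scope", conn))
        {
            cmd.Parameters.AddWithValue("@scope", "OriginalScope");
            raw = (byte[])cmd.ExecuteScalar();
        }

        // Deserialize needs the same id formats that produced the blob;
        // the provider for the original scope supplies them.
        var provider = new SqlSyncProvider("OriginalScope", conn);
        SyncKnowledge knowledge = SyncKnowledge.Deserialize(provider.IdFormats, raw);
    }
    ```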


    • Moved by Olaf Helper (MVP) Friday, January 9, 2015 8:39 AM Moved from "Database Engine" to a more related forum
    Thursday, January 8, 2015 7:37 PM

All replies

  • Hi khauser24,

    As the issue is related to Sync Framework, for a quick and accurate response I would recommend posting the question in the Sync Framework forums at . It is more appropriate, and more experts there will be able to assist you.

    If you have any feedback on our support, please click here.

    Lydia Zhang
    TechNet Community Support

    Friday, January 9, 2015 3:11 AM
  • The reason I chose the SQL community is, quite simply, that according to, the SyncFx forums are no longer monitored for priority support.  And in fact there is NO traffic there.  So can you tell me that this will indeed be answered by Microsoft?  Or should I cut my losses and create a support incident?
    Friday, January 9, 2015 6:16 PM
  • In order to decipher the sync knowledge for an existing scope, you will have to deal with the replica-key mapping as well.

    You can have a look at merging the sync knowledge via SyncKnowledge.Combine.

    Or, if you know the client's existing scope is up to date, you can simulate a sync on the new scopes and, in the ChangesSelected event, remove all rows so nothing gets applied. When the sync finishes, it will write the new sync knowledge to the new scope, thinking it actually synced.
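    A rough sketch of what I mean, assuming the database providers and a collaboration-style SyncOrchestrator (scope and connection names are placeholders):

    ```csharp
    using Microsoft.Synchronization;
    using Microsoft.Synchronization.Data;
    using Microsoft.Synchronization.Data.SqlServer;

    var local  = new SqlSyncProvider("NewSingleTableScope", localConnection);
    var remote = new SqlSyncProvider("NewSingleTableScope", remoteConnection);

    // Strip every selected row so nothing is ever applied; Sync Fx still
    // records the resulting knowledge as if the rows had been synced.
    EventHandler<DbChangesSelectedEventArgs> dropAll = (sender, e) =>
    {
        foreach (System.Data.DataTable table in e.Context.DataSet.Tables)
            table.Rows.Clear();
    };
    local.ChangesSelected  += dropAll;
    remote.ChangesSelected += dropAll;

    var orchestrator = new SyncOrchestrator
    {
        LocalProvider  = local,
        RemoteProvider = remote,
        Direction      = SyncDirectionOrder.UploadAndDownload
    };
    orchestrator.Synchronize();
    ```

    Note this still pays the cost of enumerating and shipping the candidate rows; it only skips applying them.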

    Monday, January 12, 2015 1:23 AM
  • Hi June!

    I am happy to see you are still active here.  In my opinion Microsoft should be paying you; you, more than anyone else, have fulfilled the support needs of this product!

    OK, enough of the buttering up ;)

    If I understand your reply, it would seem that if I start with a scope that has 10 different tables in it and separate each table into its own scope, I could seed that new scope's knowledge with the entirety of the knowledge from the original scope.  Then the next time that scope syncs, it will work as if it were the original scope, except on just the one table, and upon completion that scope's individual knowledge will correctly reflect just the table it represents.

    Is that correct?  Does it matter if there were unresolved conflicts when I split them?  Do I need to do the simulated sync you spoke of, or can I just let it sync normally?

    Thanks again!


    Monday, January 12, 2015 5:42 PM
  • Sorry, let me clarify.

    The sync knowledge is maintained per scope; it has no idea what another scope has synced. If a table belongs to more than one scope, none of the scopes knows what has been synced by the others.

    What I am saying is that if you know your tables are up to date (nothing to upload, nothing to download), you may simulate a sync on the new scope.

    The new scope's sync knowledge is of course empty the first time; by initiating a sync, it will detect there are rows to be synced. When you skip the actual application of the changes, you're tricking Sync Fx into storing sync knowledge that says it has synced those rows.

    So you're starting with fresh sync knowledge just for that table in that scope.

    Wednesday, January 14, 2015 1:18 AM
  • Hi June,

    OK, that's not what I was looking to do, though it's an improvement over the current situation.  Some customers have one SQL database located far away from another and, more importantly, on a network with less-than-ideal speed.  So a sync in which I simulate a good sync will still take a long time, because I don't get called in ApplyingChanges until sync has fetched all the candidate rows.  That's still faster than letting the initial sync run and dealing with each conflict one at a time (all insert-insert), but it's still going to be relatively slow on larger databases.

    An additional problem: there's no way for me to force the customers to have a 'clean' system with no conflicts, and if I followed these suggestions, rows in conflict would be lost (they would not attempt to sync unless touched again).

    That's why I wanted to understand how the knowledge 'knows' what tables it applies to.

    What I thought you were suggesting was that I take the existing knowledge and duplicate it for the new scopes.  The original scope is never going to be run again, but I would have several new scopes that each had a copy of the original knowledge.  I thought the next sync of each scope would then fix up that copy of the knowledge.

    I very much appreciate your input!


    Wednesday, January 14, 2015 2:17 PM
  • You should explore SyncKnowledge.Combine, then. But the sync knowledge will contain some stale data that will never be used again.
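    A sketch of how Combine could be used to seed each new single-table scope's knowledge from the original scope's. This is not a documented procedure; reading and writing the serialized knowledge in scope_info yourself, and the variable names, are assumptions:

    ```csharp
    using Microsoft.Synchronization;

    // 'provider' is a SqlSyncProvider for the original scope; its IdFormats
    // are needed to deserialize the stored knowledge blobs.
    // 'rawOriginal' and 'rawNewScope' are the varbinary knowledge values
    // read from the old and new scope_info rows respectively.
    SyncKnowledge original = SyncKnowledge.Deserialize(provider.IdFormats, rawOriginal);
    SyncKnowledge perTable = SyncKnowledge.Deserialize(provider.IdFormats, rawNewScope);

    // Combine mutates the instance it is called on, folding in everything
    // the original scope's knowledge contains.
    perTable.Combine(original);

    byte[] seeded = perTable.Serialize();
    // ... write 'seeded' back to the new scope's scope_info row ...
    ```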
    Thursday, January 15, 2015 12:49 AM