Suggestions for handling batching during initial sync (SQL 2008 change tracking, SQL CE)

  • Question

  • Hi, maybe some of you with more sync framework experience could provide some suggestions on this one. Skip to the questions at the bottom if the background info is unnecessary :)

    Environment
    Environment
    We are using the Sync Framework to synchronize a local SQL CE database with a SQL Server 2008 database over a WCF web service. We use the built-in change tracking on SQL Server 2008. We use Sync Framework 2.0, but since the initial code was generated by the Visual Studio wizard, it uses SqlCeClientSyncProvider and DbServerSyncProvider - this lets us use the native SQL Server change tracking, but prevents us from using the newer batching functionality of the other 2.0 providers.

    Implementation
    Implementation
    We have the CompactFormatterPlus handling the serialization (thanks to JuneT for invaluable information here and on the blog), and we implemented batching using the SelectNewAnchorCommand and a stored procedure built on one of the several good examples available (using the native change tracking functions, not timestamps). We also implemented row filters on the selects by letting the application modify the wizard-generated SQL at runtime - basically we just include a customer ID in the query, so that each customer only gets his/her records from the tables.
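    For context, our batch-anchor procedure follows roughly this shape - a sketch based on the published usp_GetNewBatchAnchor examples, adapted to change tracking versions instead of timestamps. Parameter names follow those examples, and dbo.Orders stands in for one of our real tables:

    ```sql
    -- Sketch only: batch-anchor procedure using SQL 2008 change tracking
    -- versions instead of timestamps. dbo.Orders is a placeholder name.
    CREATE PROCEDURE usp_GetNewBatchAnchor
        @sync_last_received_anchor bigint,
        @sync_batch_size           bigint,
        @sync_max_received_anchor  bigint OUTPUT,
        @sync_new_received_anchor  bigint OUTPUT,
        @sync_batch_count          int    OUTPUT
    AS
    BEGIN
        -- Ceiling of the sync window: the current change tracking version.
        IF @sync_max_received_anchor IS NULL
            SELECT @sync_max_received_anchor = CHANGE_TRACKING_CURRENT_VERSION();

        -- Floor: the last anchor the client received, or the minimum valid
        -- version when nothing has been received yet.
        IF @sync_last_received_anchor IS NULL OR @sync_last_received_anchor = 0
            SELECT @sync_last_received_anchor =
                CHANGE_TRACKING_MIN_VALID_VERSION(OBJECT_ID('dbo.Orders'));

        -- Advance one batch, clamped to the ceiling.
        SET @sync_new_received_anchor = @sync_last_received_anchor + @sync_batch_size;
        IF @sync_new_received_anchor > @sync_max_received_anchor
            SET @sync_new_received_anchor = @sync_max_received_anchor;

        -- Report how many batches remain to the runtime.
        IF @sync_batch_count <= 0
            SET @sync_batch_count = CONVERT(int,
                ((@sync_max_received_anchor - @sync_new_received_anchor)
                 / @sync_batch_size) + 1);
    END
    ```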

    Problem
    Problem
    Batching is not in effect for the initial sync, or for a re-initialized sync - this is a known limitation. We want to work around this, as the initial sync is the heavy one. We can sort of enable batching by modifying the generated SQL to take batching into account, but the problem is really a conceptual one: if we enable batching for the initial sync as well, what do we use to divide records into batches? The "standard" stored procedure uses CHANGE_TRACKING_MIN_VALID_VERSION, but this is no good for initialization. Using a range from 0 to CHANGE_TRACKING_CURRENT_VERSION is an option, but given that a table could contain millions of records, of which maybe only 1000 are relevant for this specific customer (remember the customer ID in the row filter), it seems like a bad idea in terms of the number of batches and overhead.

    Questions
    Questions
    We are considering handling the initial sync by using completely different anchors, e.g. using the row count of the customer-specific select query (with some static sorting, of course). We would then select the records in batches, and afterwards manually write the relevant final CHANGE_TRACKING_CURRENT_VERSION to the client DB, to prepare for subsequent syncs.
    Would this work - are the anchors as simple as they seem, so their contents could be anything as long as we handle it during the sync session (in SQL and the stored procedure), or will something inside the framework get mixed up by using custom anchors during initialization?
    Can we modify the parameter list of the usp_GetNewBatchAnchor SP to accept the extra parameter required to mimic the row filter?
    Are there better ways of doing this, or other workarounds for the combination of initial sync, batching and row filters?
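    To make the row-count anchor idea concrete, the initialization-only select could look roughly like this - purely a sketch, where dbo.Orders, CustomerId and OrderId are placeholder names and the @sync_* and @CustomerId values arrive as command parameters:

    ```sql
    -- Hypothetical fragment: during initialization, anchors are interpreted
    -- as row offsets into the customer-filtered, statically sorted result
    -- set rather than as change tracking versions.
    WITH NumberedRows AS
    (
        SELECT o.*,
               ROW_NUMBER() OVER (ORDER BY o.OrderId) AS RowNum  -- static sort
        FROM dbo.Orders o
        WHERE o.CustomerId = @CustomerId                         -- row filter
    )
    SELECT *
    FROM NumberedRows
    WHERE RowNum >  @sync_last_received_anchor                   -- previous batch end
      AND RowNum <= @sync_new_received_anchor;                   -- current batch end
    ```

    After the last batch, the CHANGE_TRACKING_CURRENT_VERSION() captured at the start of the session would be written to the client's anchor store, so that subsequent syncs can fall back to the normal version-based anchors.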

    Any input greatly appreciated, thanks,
    Jesper

    Monday, August 30, 2010 1:53 PM

Answers

  • have you looked at the approach here? http://social.microsoft.com/Forums/en-US/syncdevdiscussions/thread/a2580bed-e606-4174-a075-2822f1939113

    afaik, the GetNewBatchAnchor is just like any other stored procedure, you can add additional parameters if you want. as for the anchors, i think you'll be fine as long as you stick to the data type (unless you want to override the anchor methods as well to save to your own anchor table) and keep them incremental (sync is based on incremental changes from a point onwards).
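    a rough sketch of what appending the extra parameter could look like - @CustomerId and dbo.Orders are just placeholder names, and the matching parameter would be registered on the SelectNewAnchorCommand on the client side:

    ```sql
    -- Sketch only: the batch-anchor procedure with an extra row-filter
    -- parameter appended. @CustomerId and dbo.Orders are placeholders.
    ALTER PROCEDURE usp_GetNewBatchAnchor
        @CustomerId                int,       -- extra filter parameter
        @sync_last_received_anchor bigint,
        @sync_batch_size           bigint,
        @sync_max_received_anchor  bigint OUTPUT,
        @sync_new_received_anchor  bigint OUTPUT,
        @sync_batch_count          int    OUTPUT
    AS
    BEGIN
        -- During initialization, the ceiling could be the customer's row
        -- count rather than a change tracking version (the row-count
        -- anchors discussed in the question).
        IF @sync_max_received_anchor IS NULL
            SELECT @sync_max_received_anchor = COUNT_BIG(*)
            FROM dbo.Orders
            WHERE CustomerId = @CustomerId;

        -- Advance one batch, clamped to the ceiling.
        SET @sync_new_received_anchor =
            ISNULL(@sync_last_received_anchor, 0) + @sync_batch_size;
        IF @sync_new_received_anchor > @sync_max_received_anchor
            SET @sync_new_received_anchor = @sync_max_received_anchor;

        -- Report remaining batches to the runtime.
        IF @sync_batch_count <= 0
            SET @sync_batch_count = CONVERT(int,
                ((@sync_max_received_anchor - @sync_new_received_anchor)
                 / @sync_batch_size) + 1);
    END
    ```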

    apart from using the built-in SQL Change Tracking, is there any other reason why you don't just move to the SqlSyncProvider/SqlCeSyncProvider, which natively support batching and snapshot initialization?

    • Marked as answer by Js_dk Tuesday, August 31, 2010 9:09 AM
    Monday, August 30, 2010 2:57 PM
    Moderator

All replies

  • Thanks June. I saw that article, but was kind of focused on wanting the sync framework to handle it. Maybe I should give it a second thought, though - it seems like a nice solution after all.

    > any other reason why dont you just moved to the SqlSyncProvider/SqlCeSyncProvider

    It is just because we want the SQL 2008 change tracking. It is a large, global database which many customers depend upon, both for PC apps (soon using Sync Fx) and for our online services. We are concerned that the sp+tables+triggers approach will get messy and have too much impact on overall performance. I am still having a hard time figuring out why the Sync Framework team decided that native change tracking and support for batching had to be a question of one or the other.

    Best regards,
    Jesper.

    Tuesday, August 31, 2010 7:29 AM
  • moving away from the built-in SQL 2008 Change Tracking allows Sync Fx to support SQL Server versions lower than 2008 and to extend to other database platforms as well.

    It would be nice if they allowed it as an option though, similar to the way the Sync Fx v2.1 bulk procedures take advantage of the table-valued parameter feature of SQL 2008 and SQL Azure.

    Tuesday, August 31, 2010 7:47 AM
    Moderator