How to: Pre-Create SSCE 3.5 Client Database using filters

  • Question

  • Hi,

    we are using SSCE 3.5 databases as client databases; the server uses SQL Server 2008 with change tracking enabled. We are filtering the data being sent to the clients. Since we still have a large amount of data (> 500,000 rows) being synced to the devices when the initial synchronization takes place, we want to let the server take care of initializing the client databases.

    My question is: how can we create the client .sdf files automatically on the server side according to our filter restrictions? The goal is to have a mechanism like the one used in VS2008 when creating a local database cache. But as far as I know, the wizard for the local database cache doesn't support row filters on tables.
    I have already built such a client database myself, but then I still had the problem that on the initial sync the server sent back all the data that was already in the client database. How can I prevent the server from doing that? Otherwise, pre-creating the client doesn't make much sense to me, given that the data comes down on the initial sync anyway.

    Thanks in advance for any answers or tips on how to solve our problem!

    Peter

    • Moved by Hengzhe Li Friday, April 22, 2011 3:20 AM (From:SyncFx - Microsoft Sync Framework Database Providers [ReadOnly])
    Tuesday, April 21, 2009 6:47 PM

All replies


  • Are you just trying to get a filtered client? Filters can be done in two ways (static and dynamic), where static filtering needs to be done on the server side with a fixed filter clause in the SelectIncremental* change queries. BOL has details in this section: ms-help://MS.SynchronizationServices.v1.EN/syncdata1/html/15abacc8-a243-4570-86e9-da95bb5bfddd.htm

    If you want to generate clients at the server and then send the .sdf file to the remote client machines (devices), then you need to follow the same logic described in the BOL section above to get the initial data (with the right filters). Also, the sync application on the clients (desktop machines or devices) needs to use the right filter for this particular client DB in order to keep the data on that client consistent.
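    The static/dynamic distinction above can be sketched as follows. This is not a Sync Framework API, just a standalone Python illustration of the two query shapes; the table and column names (dbo.Customers, CustomerId) and the anchor parameter are placeholder assumptions, not from this thread.

```python
# Hedged sketch: the same incremental-changes query with a static vs. a
# dynamic filter clause. Placeholder schema; not a real Sync Framework API.

BASE_QUERY = (
    "SELECT c.* FROM dbo.Customers c "
    "JOIN CHANGETABLE(CHANGES dbo.Customers, @sync_last_received_anchor) ct "
    "ON c.CustomerId = ct.CustomerId"
)

def static_filter_query(customer_id: int) -> str:
    # Static filter: the value is baked into the command text when the
    # server-side sync commands for this one client are generated.
    # (Acceptable for generated-per-client commands; never build SQL like
    # this from untrusted input.)
    return f"{BASE_QUERY} WHERE c.CustomerId = {customer_id}"

def dynamic_filter_query() -> str:
    # Dynamic filter: one shared command text; the value arrives as a
    # session parameter at sync time.
    return f"{BASE_QUERY} WHERE c.CustomerId = @CustomerId"

print(static_filter_query(42))
print(dynamic_filter_query())
```

    The point of the BOL guidance is that whichever variant is chosen, the client that later syncs against the pre-created database must apply exactly the same filter, or the data drifts apart.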

    thanks
    Yunwen
    This posting is provided "AS IS" with no warranties, and confers no rights.
    Saturday, May 2, 2009 4:42 AM
    Moderator
  • Hi Yunwen,

    thank you very much for your reply.

    I've already improved my work on the initial sync. I found a very good post in this forum by OverloadedOverrides (http://social.msdn.microsoft.com/Forums/en-US/uklaunch2007ado.net/thread/4fd6df1a-52bb-47e9-9ff1-6427bf577b92). It helped me a lot to understand what happens when initially syncing a new client database.

    So right now I have a solution that "works" with batching. I have a client implementation that runs on the server side and is responsible for initializing a new database. It creates a database file according to my filter restrictions. After the file is created, I distribute it to the real clients via HTTP download.

    But now I have a new problem, as described in the post linked above: using SQL Server change tracking results in very poor segmentation of the data for batching once cleanup has run on the server. The simple stored procedure I now use for creating the batches (taken from the batching sample on MSDN) calculates several batches, but because of the cleaned-up server data, lots of rows have the same version. This results in a few large batches followed by lots of empty or very small ones, and the problem I wanted to work around is back again.

    I don't want to use the solution posted by OverloadedOverrides. He uses an additional table to calculate the batches, and as far as I understand his solution, he uses custom insert commands on the SyncAdapters to select the inserts from the server. But I want to use the SqlSyncAdapterBuilder to generate my adapters!

    Is there a different solution for the batch calculation, and especially for the selection of the inserts from the server, that allows better segmentation?
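    The skew described above can be reproduced in a few lines. This is a standalone Python sketch, not Sync Framework code: it assumes (as in the scenario above) that change-tracking cleanup has collapsed most rows onto one low version, while recent changes are spread over higher versions, and compares equal-width version ranges (the MSDN sample's approach) with fixed-size row batches.

```python
# Hedged sketch: why equal-width version ranges give skewed batches after
# change-tracking cleanup. All numbers below are made-up illustration data.

def batch_by_version(versions, batch_count):
    """Split rows into batches by equal-width version ranges
    (mimicking the logic of the MSDN batching sample)."""
    lo, hi = min(versions), max(versions)
    if hi == lo:  # every row has the same version: one giant batch
        return [len(versions)] + [0] * (batch_count - 1)
    width = (hi - lo) / batch_count
    counts = [0] * batch_count
    for v in versions:
        counts[min(int((v - lo) / width), batch_count - 1)] += 1
    return counts

def batch_by_rownum(versions, batch_size):
    """Alternative: fixed-size batches over rows in a stable order
    (e.g. ROW_NUMBER over the primary key), independent of versions."""
    ordered = sorted(versions)
    return [len(ordered[i:i + batch_size])
            for i in range(0, len(ordered), batch_size)]

# 10,000 rows stuck at version 0 after cleanup, 500 rows at versions 1..500.
versions = [0] * 10_000 + list(range(1, 501))

print(batch_by_version(versions, 5))       # → [10099, 100, 100, 100, 101]
print(batch_by_rownum(versions, 2_000))    # → [2000, 2000, 2000, 2000, 2000, 500]
```

    The first strategy puts almost everything into one batch; the second stays even regardless of how cleanup has flattened the versions, which is the direction a better-segmenting stored procedure would have to take.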

    thanks,

    Peter
    Monday, May 4, 2009 9:52 AM