Synchronize multiple tables with complex structure

  • Question

  • Hello,

    I have the following scenario:

    - I have a large database (SQL Server 2005) containing structured information (models). The data for these models comes from six different tables and is structured like a tree
    - The database contains 20,000 different models, and each has a version number that increases when any part of the model is changed
    - I need to synchronize these models to a Windows client, a WinForms application that uses SQL Server CE and the Sync Framework
    - I should transfer complete models: if the transfer fails at some point, the client must still contain only complete models on the CE side
    - The transfer is only from server to client

    I would like to fill a DataSet with one model on the server side (from the six tables) and send the models one by one to the client. The sync knowledge would be the model version number.
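
    Roughly what I have in mind on the server side is something like this sketch (the table names and the ModelId column are just placeholders for my real schema):

        // Sketch: load one complete model (all six tables) into a single DataSet.
        // Table names and the ModelId column are placeholders for the real schema.
        using System.Data;
        using System.Data.SqlClient;

        public static class ModelLoader
        {
            private static readonly string[] ModelTables =
            {
                "ModelHeader", "ModelNodes", "ModelParts",
                "ModelProperties", "ModelLinks", "ModelTexts"
            };

            public static DataSet LoadModel(string connectionString, int modelId)
            {
                DataSet ds = new DataSet("Model");
                using (SqlConnection conn = new SqlConnection(connectionString))
                {
                    conn.Open();
                    foreach (string table in ModelTables)
                    {
                        string sql = "SELECT * FROM " + table + " WHERE ModelId = @id";
                        using (SqlDataAdapter adapter = new SqlDataAdapter(sql, conn))
                        {
                            adapter.SelectCommand.Parameters.AddWithValue("@id", modelId);
                            adapter.Fill(ds, table);   // one DataTable per source table
                        }
                    }
                }
                return ds;   // sent to the client and applied there as one unit
            }
        }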

    How should I start implementing this? With custom providers?

    BR

    -Tero
    • Edited by Tero-T Friday, April 24, 2009 10:36 AM
    • Moved by Max Wang_1983 Thursday, April 21, 2011 1:22 AM forum consolidation (From:SyncFx - Technical Discussion [ReadOnly])
    Friday, April 24, 2009 10:35 AM

Answers

  • Hi -

    You are right that we do batching at the table level. Currently we do not support synchronizing entities, or batching at the entity level. The only workaround I can think of is to wait until all batches have been received before applying changes, so you can send your data in batches that fit your memory/size requirements. We then try to apply all of these changes inside a single transaction. If we are unable to apply a change (row), we raise an OnChangeFailed event. Throwing an exception inside this event causes the entire transaction to be rolled back and no changes to be applied. This of course limits your scenario to all-or-nothing semantics, which is a drawback.

    Thanks
    Deepa
    Deepa ( Microsoft Sync Framework)
    • Marked as answer by Tero-T Monday, July 6, 2009 5:56 AM
    Thursday, July 2, 2009 6:42 PM
    Answerer

All replies

    You have server data in SQL Server 2005 and you would like the sync application to bring it down to the SQL CE client, right? If so, please check out Sync Services for ADO.NET: http://msdn.microsoft.com/en-us/sync/bb887608.aspx.
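
    A minimal download-only configuration with those providers looks roughly like the sketch below. The table name is a placeholder, and the server provider still needs a SyncAdapter and anchor commands for each table (for example generated with SqlSyncAdapterBuilder) before it will run against a real database:

        // Sketch: download-only sync from SQL Server 2005 to SQL Server CE with
        // Sync Services for ADO.NET. "ModelHeader" is a placeholder table name.
        using System.Data.SqlClient;
        using Microsoft.Synchronization;
        using Microsoft.Synchronization.Data;
        using Microsoft.Synchronization.Data.Server;
        using Microsoft.Synchronization.Data.SqlServerCe;

        public class ModelSyncAgent : SyncAgent
        {
            public ModelSyncAgent(string serverConnectionString, string clientConnectionString)
            {
                DbServerSyncProvider serverProvider = new DbServerSyncProvider();
                serverProvider.Connection = new SqlConnection(serverConnectionString);
                this.RemoteProvider = serverProvider;

                this.LocalProvider = new SqlCeClientSyncProvider(clientConnectionString);

                SyncTable modelTable = new SyncTable("ModelHeader");
                modelTable.CreationOption = TableCreationOption.DropExistingOrCreateNewTable;
                modelTable.SyncDirection = SyncDirection.DownloadOnly;
                this.Configuration.SyncTables.Add(modelTable);
            }
        }

        // Usage:
        //   ModelSyncAgent agent = new ModelSyncAgent(serverConn, clientConn);
        //   SyncStatistics stats = agent.Synchronize();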

    Thanks.
    Leo Zhou ------ This posting is provided "AS IS" with no warranties, and confers no rights.
    Thursday, June 4, 2009 9:57 PM
    Answerer
    Thank you for your answer. Yes, it is SQL 2005 --> CE. I have used that, but the problem there is the table "structure": I need to transfer entities that span multiple tables at once, so that if the transfer breaks down, only complete entities have been transferred.

    E.g. I have 1000 models to transfer. Those models contain data in six different tables. Let's assume the transfer breaks in the middle and I have received 551 models. I want all six tables to contain the data for exactly those 551 models.

    I can add the tables to a sync group so that all the tables are updated in one transaction, BUT there is a problem: I have too much data in the tables to download in a single batch. And I can't batch the data either, because then I would receive incomplete data.
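
    What I mean by grouping is roughly this (a sketch, with placeholder table names):

        // Sketch: put all six model tables into one SyncGroup so that Sync Services
        // applies their changes in a single transaction on the client.
        using Microsoft.Synchronization;
        using Microsoft.Synchronization.Data;

        public static class ModelSyncGroupSetup
        {
            public static void AddModelTables(SyncAgent agent)
            {
                SyncGroup modelGroup = new SyncGroup("ModelGroup");
                string[] tables =
                {
                    "ModelHeader", "ModelNodes", "ModelParts",
                    "ModelProperties", "ModelLinks", "ModelTexts"
                };

                foreach (string tableName in tables)
                {
                    SyncTable syncTable = new SyncTable(tableName);
                    syncTable.SyncDirection = SyncDirection.DownloadOnly;
                    syncTable.SyncGroup = modelGroup;   // grouped tables are applied together
                    agent.Configuration.SyncTables.Add(syncTable);
                }
            }
        }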

    I think I should generate a data set that contains all six tables, fill it with one model's data, and send it to the client; this would be repeated 1000 times or so. Does this mean I need to create a custom provider?

    Thanks
    Thursday, June 25, 2009 9:46 AM
    Sync Framework 2.0 CTP provides full support for batching, which preserves data integrity and should solve the issue you have. Please download it from http://www.microsoft.com/downloads/details.aspx?FamilyID=89adbb1e-53ff-41b5-ba17-8e43a2e66254&displaylang=en. The Sync Framework SDK is part of the installation package; see the SDK topic "How to: Deliver Changes in Batches".
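
    With the 2.0 database providers, batching comes down to a couple of properties on each provider. A rough sketch (the scope name, paths and sizes are placeholders, the scope must already be provisioned on both databases, and exact member names may differ slightly in the CTP):

        // Sketch: batched, download-only sync with the Sync Framework 2.0 database
        // providers. "ModelScope", the paths and the cache size are placeholders.
        using System.Data.SqlClient;
        using System.Data.SqlServerCe;
        using Microsoft.Synchronization;
        using Microsoft.Synchronization.Data.SqlServer;
        using Microsoft.Synchronization.Data.SqlServerCe;

        public static class BatchedModelSync
        {
            public static void Run(string serverConnectionString, string clientConnectionString)
            {
                SqlSyncProvider serverProvider = new SqlSyncProvider(
                    "ModelScope", new SqlConnection(serverConnectionString));
                SqlCeSyncProvider clientProvider = new SqlCeSyncProvider(
                    "ModelScope", new SqlCeConnection(clientConnectionString));

                // Spool changes to disk in ~1 MB batches instead of one large data set.
                serverProvider.MemoryDataCacheSize = 1024;            // KB
                serverProvider.BatchingDirectory = @"C:\SyncBatches";
                clientProvider.MemoryDataCacheSize = 1024;
                clientProvider.BatchingDirectory = @"C:\SyncBatches";

                SyncOrchestrator orchestrator = new SyncOrchestrator();
                orchestrator.LocalProvider = clientProvider;
                orchestrator.RemoteProvider = serverProvider;
                orchestrator.Direction = SyncDirectionOrder.Download;   // server -> client
                orchestrator.Synchronize();
            }
        }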

    Thanks,
    Ann Tang
    Thursday, June 25, 2009 9:15 PM
    Thanks for the replies. I have seen samples of batching, but it doesn't fit here. The problem is that batching is done at the table level, not at the entity level. Again, an example:

    I have 1000 models and I need to transfer one model at a time from server to client. One model might have 1 to 10,000 records across the six tables, so I would need to transfer data from those six tables at once to receive one model. This would be repeated 1000 times or so to get all of the models. That way I can be sure that when a model is opened in the UI, all of its records are on the client side and no data is missing from the tables. I don't want to do traditional table-to-table updates because this would cause problems in the UI.



    Thursday, July 2, 2009 12:20 PM
  • Hi -

    You are right that we do batching at the table level. Currently we do not support synchronizing entities, or batching at the entity level. The only workaround I can think of is to wait until all batches have been received before applying changes, so you can send your data in batches that fit your memory/size requirements. We then try to apply all of these changes inside a single transaction. If we are unable to apply a change (row), we raise an OnChangeFailed event. Throwing an exception inside this event causes the entire transaction to be rolled back and no changes to be applied. This of course limits your scenario to all-or-nothing semantics, which is a drawback.
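
    With the offline (ADO.NET) providers, that handler looks roughly like the sketch below; the event is exposed on the client provider as ApplyChangeFailed:

        // Sketch: all-or-nothing change application. Hook the client provider's
        // change-failure event and throw, so the single transaction that applies
        // all tables is rolled back and no partial model is committed.
        using System;
        using Microsoft.Synchronization.Data;
        using Microsoft.Synchronization.Data.SqlServerCe;

        public static class AllOrNothingSync
        {
            public static void Attach(SqlCeClientSyncProvider clientProvider)
            {
                clientProvider.ApplyChangeFailed += OnApplyChangeFailed;
            }

            private static void OnApplyChangeFailed(object sender, ApplyChangeFailedEventArgs e)
            {
                // Throwing here aborts change application; because every table is
                // applied inside one transaction, nothing is committed.
                throw new InvalidOperationException(
                    "A row could not be applied; rolling back the whole sync transaction.");
            }
        }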

    Thanks
    Deepa
    Deepa ( Microsoft Sync Framework)
    • Marked as answer by Tero-T Monday, July 6, 2009 5:56 AM
    Thursday, July 2, 2009 6:42 PM
    Answerer
    Thank you, Deepa. That was my conclusion as well; thanks for confirming it.

    -Tero

    Monday, July 6, 2009 5:57 AM