Limit the amount of data using Sync Framework synchronisation

Question
-
I'm using Sync Framework functions to synchronise a database table with logbook messages at our customer's site with a copy of that table in our office database. Is it possible to limit the amount of data that is sent when the connection is re-established after a period of time, and to send the data in batches instead of all at once?

Tuesday, December 13, 2011 8:09 AM
Answers
-
Normally we want to set this maximum amount bigger than the amount of data that we are expecting during the session - if you set the maximum amount bigger than the expected data, then there's no need to do batching. Am I missing something?
Is it perhaps possible to investigate the amount of data before transfer? - you can simulate a sync, and on the ChangesSelected event you can figure out how many rows have changed, down to each table. When you get the row counts, simply throw an error and that will cancel the sync.
In your example above, if you have a 10 MB change dataset and you want to apply changes in 1 MB increments, you will have to invoke the sync 10 times.
Assuming you could apply the changes partially, how would you tell your user which rows comprise the 1 MB you partially applied out of the 10 MB change dataset? If you have related tables, this becomes even harder. Do you just bring over the parent table first? What about its child records? 1 MB of parent rows and zero child rows? 500 KB of parent rows and whatever child rows fit in the remaining 500 KB?
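The row-count check described above could be sketched roughly as follows. This is a minimal sketch, not a complete program: the scope name "LogbookScope", the connection string, and the 10,000-row threshold are made-up values, while `ChangesSelected`, `DbChangesSelectedEventArgs`, and the `DbSyncContext` properties used here are part of the Sync Framework provider API.

```csharp
using System;
using System.Data;
using System.Data.SqlClient;
using Microsoft.Synchronization.Data;
using Microsoft.Synchronization.Data.SqlServer;

// Hypothetical scope name and connection string, for illustration only.
var localProvider = new SqlSyncProvider(
    "LogbookScope", new SqlConnection("Data Source=...;Initial Catalog=..."));

// Inspect the selected changes before they are transferred; if the
// change set is larger than we are willing to move, abort the session.
localProvider.ChangesSelected += (sender, e) =>
{
    int totalRows = 0;
    // When the changes are not spooled to batch files, the selected rows
    // are available in the DataSet on the sync context, one DataTable
    // per synchronized table.
    if (!e.Context.IsDataBatched)
    {
        foreach (DataTable table in e.Context.DataSet.Tables)
            totalRows += table.Rows.Count;
    }

    const int maxRows = 10000; // assumed threshold
    if (totalRows > maxRows)
        throw new Exception(string.Format(
            "Change set too large ({0} rows); sync cancelled.", totalRows));
};
```

Throwing from the event handler aborts the session before any data is applied, which matches the "give an error message instead of starting the sync" behaviour asked for above.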
- Marked as answer by RobvR -van Riet- Wednesday, December 14, 2011 11:10 AM
Wednesday, December 14, 2011 8:17 AM
All replies
-
You can apply filters to restrict the rows being synched (e.g., filter by customer id). You may also use batching.
Which provider are you using?
Tuesday, December 13, 2011 8:17 AM
-
Thanks for your response,
I used the sample at http://msdn.microsoft.com/en-us/library/dd918848.aspx as a starting point and removed the Compact databases, so I'm using SqlSyncProvider. I'm familiar with the filters, but how do I use batching? I have tried setting MemoryDataCacheSize, but that does not seem to do what I expect.
- Edited by RobvR -van Riet- Tuesday, December 13, 2011 1:08 PM
Tuesday, December 13, 2011 8:38 AM
-
Can you share what exactly you were expecting in terms of batching?
Batching is applied to a sync session. Assuming you set the batch size to 1 MB and you have a 10 MB change dataset, batching simply transfers the changes in 1 MB chunks: ten one-megabyte chunks are transferred, but the entire 10 MB is applied as a single transaction. The batching applies to the data transfer, not to the change application.
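Enabling the transfer batching described above could look roughly like this. A sketch under assumptions: the connection strings, scope name, and batch directory are illustrative, while `MemoryDataCacheSize` (in KB) and `BatchingDirectory` are the actual provider properties that control batching.

```csharp
using System.Data.SqlClient;
using Microsoft.Synchronization;
using Microsoft.Synchronization.Data.SqlServer;

// Illustrative scope name and connection strings.
var localProvider  = new SqlSyncProvider("LogbookScope", new SqlConnection("Data Source=...;"));
var remoteProvider = new SqlSyncProvider("LogbookScope", new SqlConnection("Data Source=...;"));

// Spool changes to disk in batches of roughly 1 MB (the value is in KB).
// Each provider needs a directory it can write its batch files to.
localProvider.MemoryDataCacheSize  = 1024;
localProvider.BatchingDirectory    = @"C:\SyncBatches";
remoteProvider.MemoryDataCacheSize = 1024;
remoteProvider.BatchingDirectory   = @"C:\SyncBatches";

var orchestrator = new SyncOrchestrator
{
    LocalProvider  = localProvider,
    RemoteProvider = remoteProvider,
    Direction      = SyncDirectionOrder.UploadAndDownload
};
var stats = orchestrator.Synchronize();
```

Note that, as explained above, this only chunks the wire transfer; the destination still applies all spooled batches as one transaction at the end.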
Wednesday, December 14, 2011 12:57 AM
-
What I expect is that I can control the amount of data. The goal is not to disturb the daily processes, so we synchronize when the customer is not active, for example during the night. When this goes well we want to implement this solution for all customers that have a certain service level contract, so that we can investigate what happened in the past in case of problems. What we do not want is that, when there are problems with the logging process and for example one customer has an extremely big logbook, this delays or blocks our communication channels.

My goal is that I can set a certain amount of data, in your example 1 MB, and every time I start the synchronization process 1 MB is transferred. So, reading your reply, batching is not the solution I can use? Normally we want to set this maximum amount bigger than the amount of data that we are expecting during the session. Is it perhaps possible to investigate the amount of data before the transfer, so that we can give an error message instead of starting the sync session?

Wednesday, December 14, 2011 7:26 AM
-
Thanks for your answer, you have pointed me in the right direction. I use the ChangesSelected event to check the number of rows that are about to be synchronized and throw an exception in case this amount is far too large, as you suggested. Furthermore, I have set the batch size to a smaller amount to prevent errors in case of a bad communication line. I used the BatchApplied event to check whether the synchronization is split up into batches, and it is, so that also looks fine.
- Edited by RobvR -van Riet- Wednesday, December 14, 2011 11:20 AM
Wednesday, December 14, 2011 11:18 AM
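-
For completeness, the batch monitoring mentioned in the reply above could be sketched like this. `BatchSpooled` and `BatchApplied` are real provider events; the scope name and connection string are illustrative, and the logging is just an example of how to confirm that the change set is moving in chunks.

```csharp
using System;
using System.Data.SqlClient;
using Microsoft.Synchronization.Data.SqlServer;

// Illustrative scope name and connection string.
var provider = new SqlSyncProvider("LogbookScope", new SqlConnection("Data Source=...;"));

// Raised on the source side each time a batch file is spooled to disk.
provider.BatchSpooled += (sender, e) =>
    Console.WriteLine("Spooled batch {0}: {1}",
        e.CurrentBatchNumber, e.BatchFileName);

// Raised on the destination side each time a batch has been applied,
// confirming the transfer really is split into batches.
provider.BatchApplied += (sender, e) =>
    Console.WriteLine("Applied batch {0} of {1}",
        e.CurrentBatchNumber, e.TotalBatchesToApply);
```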