Batching required in DbPeerSyncProvider

  • Question

  •  

    The CTP2 DbPeerSyncProvider works well. However, it would be nice to add batching support through a .BatchSize property, as the other server providers do. I have an unusual situation: a few of my tables are likely to reach 2 GB per table, and when I sync I get an out-of-memory exception, since I only have 3 GB of available memory. With batching the DataSet would be smaller, and I believe we would not hit this error. Here is the error for your reference:


    System.OutOfMemoryException: Exception of type 'System.OutOfMemoryException' was thrown.
       at System.Data.Common.Int32Storage.SetCapacity(Int32 capacity)
       at System.Data.RecordManager.set_RecordCapacity(Int32 value)
       at System.Data.RecordManager.GrowRecordCapacity()
       at System.Data.RecordManager.NewRecordBase()
       at System.Data.DataTable.NewRecordFromArray(Object[] value)
       at System.Data.DataTable.LoadDataRow(Object[] values, Boolean fAcceptChanges)
       at System.Data.ProviderBase.SchemaMapping.LoadDataRow()
       at System.Data.Common.DataAdapter.FillLoadDataRow(SchemaMapping mapping)
       at System.Data.Common.DataAdapter.FillFromReader(DataSet dataset, DataTable datatable, String srcTable, DataReaderContainer dataReader, Int32 startRecord, Int32 maxRecords, DataColumn parentChapterColumn, Object parentChapterValue)
       at System.Data.Common.DataAdapter.Fill(DataTable[] dataTables, IDataReader dataReader, Int32 startRecord, Int32 maxRecords)
       at System.Data.Common.DataAdapter.Fill(DataTable dataTable, IDataReader dataReader)
       at Microsoft.Synchronization.Data.Peer.SyncDbAdapter.FillFromReader(DataTable dataTable, IDataReader dataReader)
       at Microsoft.Synchronization.Data.Peer.DbPeerSyncProvider.EnumerateChangesInternal(SyncScopeMetadata scopeMetadata)
   at Microsoft.Synchronization.Data.Peer.DbPeerSyncProvider.GetChanges(SyncScopeMetadata scopeMetadata, PeerDataSyncSession PeerDataSyncSession)

    Thanks,

    Udai.

    • Moved by Max Wang_1983 Tuesday, April 19, 2011 6:16 PM Forum consolidation (From:SyncFx - Feedback [ReadOnly])
    Thursday, March 27, 2008 8:34 PM

Answers

  • Hi Udai, thanks for bringing this up. We are aware of this issue, and batching support is one of the features we are working on for a future release.

    In the meantime, as you may have already figured out, you can tweak your select query so that only a smaller chunk of data is synced in each pass, and use multiple syncs to move large change sets, especially for the initial syncs.

    Thanks,

    Yunwen

    Sunday, August 3, 2008 5:59 AM
    Moderator
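
    The multi-pass approach suggested above can be sketched roughly as follows. This is only an illustration of the idea, not the actual CTP API: the filter column (Id), the chunkSize value, the GetMaxId helper, and the way the range filter is attached to the adapter's select command are all assumptions.

    ```csharp
    // Sketch: sync one large table in several passes by restricting the
    // enumeration query to a key range, so each pass materializes a DataSet
    // small enough to fit in memory. All names below are illustrative.
    const long chunkSize = 100000;                 // rows per pass; tune to available memory
    long maxId = GetMaxId("BigTable");             // assumed helper returning MAX(Id)

    for (long low = 0; low <= maxId; low += chunkSize)
    {
        var selectCmd = new SqlCommand(
            "SELECT * FROM BigTable WHERE Id BETWEEN @low AND @high");
        selectCmd.Parameters.AddWithValue("@low", low);
        selectCmd.Parameters.AddWithValue("@high", low + chunkSize - 1);

        // Assumed: the sync adapter lets you override the query used to
        // enumerate changes (change-tracking predicates omitted here).
        adapter.SelectIncrementalChangesCommand = selectCmd;

        syncAgent.Synchronize();                   // one smaller sync per key range
    }
    ```

    Since this depends on the Sync Framework assemblies, treat it as pseudocode for the chunking strategy rather than something to compile as-is.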

All replies

  • Thanks Yunwen.

    FYI:

    I am using the RTM sync release. I worked around the issue as follows:

    Assume your call looks like this:

    _provider.GetChangeBatch(batchSize, destinationKnowledge, out changeData);

    Before sending the response, read the DataSet from changeData, store it in a local variable (the DbSyncProvider must be session-scoped), and set ScopeProgress to null. Then, from the client, read the tables one by one and populate changeData back on the client side. This makes it possible to transfer a large volume of data; I have tested it with a DataSet of over 80 MB.

    Thanks & Regards,

    Udai.

    Friday, August 8, 2008 12:18 AM
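
    Udai's workaround can be sketched roughly like this. The caching field, the DbSyncContext cast, and the GetNextTable method are assumptions made for illustration; only the GetChangeBatch call itself comes from the post above.

    ```csharp
    // Sketch: enumerate all changes once, keep the resulting DataSet on the
    // (session-scoped) server side, and let the client pull one table per
    // round trip instead of receiving the whole DataSet in a single response.
    public class ChunkedChangeService
    {
        private readonly DbSyncProvider _provider;  // must be session-scoped
        private DataSet _cachedChanges;             // survives between round trips

        public int BeginGetChanges(uint batchSize, SyncKnowledge destinationKnowledge)
        {
            object changeData;
            _provider.GetChangeBatch(batchSize, destinationKnowledge, out changeData);

            // Assumed: the change-data retriever exposes the enumerated rows
            // as a DataSet; keep it server-side rather than serializing it whole.
            _cachedChanges = ((DbSyncContext)changeData).DataSet;
            return _cachedChanges.Tables.Count;     // client fetches this many tables
        }

        // One table per call keeps each response payload small.
        public DataTable GetNextTable(int tableIndex)
        {
            return _cachedChanges.Tables[tableIndex].Copy();
        }
    }
    ```

    As with the previous sketch, this requires the Sync Framework assemblies and a service host that preserves state per client session, so treat it as an outline of the approach rather than working code.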