{"Exception of type 'System.OutOfMemoryException' was thrown."}

  • Question

  • Hi

    During a call to DbServerSyncProvider.GetChanges I get a System.OutOfMemoryException with the following stack trace:

       at System.Data.Common.Int64Storage.SetCapacity(Int32 capacity)
       at System.Data.DataColumn.SetCapacity(Int32 capacity)
       at System.Data.RecordManager.set_RecordCapacity(Int32 value)
       at System.Data.RecordManager.GrowRecordCapacity()
       at System.Data.RecordManager.NewRecordBase()
       at System.Data.DataTable.NewUninitializedRecord()
       at System.Data.RecordManager.CopyRecord(DataTable src, Int32 record, Int32 copy)
       at System.Data.RecordManager.ImportRecord(DataTable src, Int32 record)
       at System.Data.DataTable.ImportRow(DataRow row)
       at Microsoft.Synchronization.Data.Server.DbServerSyncProvider.EnumerateChanges(SyncGroupMetadata groupMetadata, SyncSession syncSession, IDbTransaction transaction, EnumerateChangeType changeType, SyncSchema traceSchema)
       at Microsoft.Synchronization.Data.Server.DbServerSyncProvider.GetChanges(SyncGroupMetadata groupMetadata, SyncSession syncSession)

    The change set is large, but I have been able to create larger DataSets manually (outside of sync) without encountering out-of-memory exceptions on a DataTable.

    Large datasets are an edge case, but one that will happen, e.g. the first time a user syncs.

    Any suggestions (best practices) on how to handle large change datasets would be appreciated.

    For example, rather than having one sync group, would dividing the sync into smaller PK-FK-safe sync groups help the sync framework when it is building its change DataSet?
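    To make the question concrete, here is a minimal sketch of what I mean by splitting one large sync into several smaller sync groups, using the Sync Services for ADO.NET SyncAgent configuration. The table names ("Customer", "Order") and group names are placeholders for illustration; the assumption is that GetChanges is invoked once per sync group, so each call would only have to materialize that group's rows in memory:

    ```csharp
    using Microsoft.Synchronization.Data;

    SyncAgent agent = new SyncAgent();

    // Parent table in its own group, synced first so FK targets exist.
    SyncGroup customerGroup = new SyncGroup("CustomerGroup");
    SyncTable customerTable = new SyncTable("Customer");
    customerTable.SyncDirection = SyncDirection.Bidirectional;
    customerTable.SyncGroup = customerGroup;
    agent.Configuration.SyncTables.Add(customerTable);

    // Child table in a second group, so its changes are enumerated
    // in a separate, smaller GetChanges call.
    SyncGroup orderGroup = new SyncGroup("OrderGroup");
    SyncTable orderTable = new SyncTable("Order");
    orderTable.SyncDirection = SyncDirection.Bidirectional;
    orderTable.SyncGroup = orderGroup;
    agent.Configuration.SyncTables.Add(orderTable);
    ```

    The idea would be that each group's change DataSet stays small enough to avoid exhausting memory, while ordering the groups parent-before-child keeps the PK-FK relationships safe. Is this the recommended approach, or is there a built-in batching mechanism I should be using instead?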


    • Moved by Hengzhe Li Friday, April 22, 2011 3:22 AM (From:SyncFx - Microsoft Sync Framework Database Providers [ReadOnly])
    Monday, April 27, 2009 1:10 PM