Why is the row size so big that I get the error: "The row of size __KB ... cannot fit within the specified batch size of __KB"?

  • Question

  • I have a table with 4 columns: Guid, int, DateTime, and VarChar(Max). In the VarChar column I store a Base64-encoded version of a PDF document (I limit the size of the source file to 2 MB so it will fit in a sync "batch").

    Works great with small PDFs ... but not larger files like this one:

    - The original PDF that I converted to Base64 is only 1.3 MB on the hard disk.

    - If I save the Base64 string to a text file, the text file is only 1.8 MB on the hard disk (I did this just to get a rough ballpark of the size in the table).

    - My BatchSize (RelationalSyncProvider.MemoryDataCacheSize) is set to 2.5 MB (0.5 MB to allow for the other fields, Sync Framework tracking fields, dataset overhead, etc.); my setup is roughly as sketched below.
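
    For reference, this is roughly how I enable batching (a minimal sketch; the connection string, scope name, and spooling directory are placeholders):

        using System.Data.SqlClient;
        using Microsoft.Synchronization.Data.SqlServer;

        static class SyncSetup
        {
            // Minimal sketch of the local provider configuration; names are placeholders.
            public static SqlSyncProvider CreateLocalProvider()
            {
                var connection = new SqlConnection("Data Source=.;Initial Catalog=MyDb;Integrated Security=True");
                var provider = new SqlSyncProvider("AttachmentScope", connection);

                // MemoryDataCacheSize is specified in KB, so 2.5 MB = 2560 KB.
                provider.MemoryDataCacheSize = 2560;

                // Batching also needs a directory where batch files are spooled.
                provider.BatchingDirectory = @"C:\Temp\SyncBatches";

                return provider;
            }
        }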

    Problem is that the Sync fails with this error:

    "... Microsoft.Synchronization.Data.DbSyncException ... The row of size 3779KB from table 'AttachmentItemContent' cannot fit within the specified batch size of 2560KB ..."

    Question:

    How did 1.3 MB turn into 3.7 MB?  A factor of nearly 3 is steep!

    I understand that the batch size must be larger than the amount of space needed by the largest record ... but what "overhead" do I have to allow for?  Are there different row versions of the data in the dataset, effectively doubling or tripling my data?  Setting the BatchSize to roughly three times the row data's size as a permanent rule seems excessive!

    How can I approximate the correct BatchSize?
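
    The closest I can get with a back-of-the-envelope guess is below (assuming Base64 inflates the file by 4/3, and that the string is held in memory as UTF-16 at 2 bytes per character; both of those, and the column overhead, are guesses on my part). It lands in the same ballpark as the 3779 KB in the error, but is that actually what is happening?

        using System;

        class RowSizeEstimate
        {
            static void Main()
            {
                long pdfBytes = (long)(1.3 * 1024 * 1024);      // ~1.3 MB source PDF on disk

                // Base64 encodes every 3 input bytes as 4 output characters.
                long base64Chars = (pdfBytes + 2) / 3 * 4;      // ~1774 KB of text

                // Guess: the value travels as a UTF-16 .NET string, i.e. 2 bytes per character.
                long inMemoryBytes = base64Chars * 2;           // ~3549 KB

                long otherColumns = 16 + 4 + 8;                 // Guid + int + DateTime, roughly

                Console.WriteLine("Base64 text:  {0} KB", base64Chars / 1024);
                Console.WriteLine("Row estimate: {0} KB", (inMemoryBytes + otherColumns) / 1024);
            }
        }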

    Thursday, August 25, 2016 8:38 PM