N-Tier sync of large data with wsHttpBinding throws System.InsufficientMemoryException

  • Question

  • Hi,

    As the title suggests, I have a problem using N-Tier synchronization when I have to handle a table that stores some image data (76 MB). I understand that wsHttpBinding is not the best choice for this situation and that basicHttpBinding with streamed transfer may be the right one, but I didn't find any resource on how my solution should be modified. Is it all a matter of web.config configuration, or is there more to it than that?

    Thank you.
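
    For reference, the binding switch itself is mostly a web.config change on both the service and the client. A minimal sketch, assuming a hypothetical service class `MyApp.SyncService` and contract `MyApp.ISyncService` (all names here are placeholders, not from this thread); note that whether streaming actually helps depends on whether the sync proxy streams its payload, which is why the replies below focus on batching instead:

```xml
<system.serviceModel>
  <bindings>
    <basicHttpBinding>
      <!-- Streamed transfer avoids buffering the whole payload in memory;
           quotas are raised so a large change set is not rejected. -->
      <binding name="streamedBinding"
               transferMode="Streamed"
               maxReceivedMessageSize="2147483647"
               sendTimeout="00:10:00"
               receiveTimeout="00:10:00">
        <readerQuotas maxArrayLength="2147483647" />
      </binding>
    </basicHttpBinding>
  </bindings>
  <services>
    <service name="MyApp.SyncService">
      <endpoint address=""
                binding="basicHttpBinding"
                bindingConfiguration="streamedBinding"
                contract="MyApp.ISyncService" />
    </service>
  </services>
</system.serviceModel>
```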

    Tuesday, April 2, 2013 8:49 AM

Answers

  • I use batching for all of my transfers; I have no idea how much data they will decide to transfer.
    • Marked as answer by Cosmin_H Thursday, April 11, 2013 7:22 PM
    Wednesday, April 3, 2013 5:26 PM

All replies

  • I had issues with OutOfMemoryException until I implemented batching and a GZip encoder/compression in a similar scenario.

    I also created my own CustomBinding (buffered, not streamed) so I could specify all the transfer parameters.

    This is the example I modified to work for me for Gzip...

    http://msdn.microsoft.com/en-us/library/cc138373(v=vs.90).aspx

    I think batching was the big one, though; the compression may have just sped up the transfer.
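
    The linked sample works by plugging a custom message encoder into a customBinding. A rough sketch of the configuration side, assuming the sample's encoder is compiled into an assembly named `GZipEncoder` (the type and element names follow the MSDN sample; everything else is a placeholder):

```xml
<system.serviceModel>
  <extensions>
    <bindingElementExtensions>
      <!-- Registers the GZip message encoder from the MSDN sample -->
      <add name="gzipMessageEncoding"
           type="Microsoft.ServiceModel.Samples.GZipMessageEncodingElement, GZipEncoder" />
    </bindingElementExtensions>
  </extensions>
  <bindings>
    <customBinding>
      <binding name="gzipBinding">
        <!-- GZip wraps an inner text encoder -->
        <gzipMessageEncoding innerMessageEncoding="textMessageEncoding" />
        <!-- Buffered transport, as described in this reply -->
        <httpTransport maxReceivedMessageSize="2147483647"
                       maxBufferSize="2147483647" />
      </binding>
    </customBinding>
  </bindings>
</system.serviceModel>
```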

    • Edited by Racing_Prog Wednesday, April 3, 2013 2:32 PM
    Wednesday, April 3, 2013 12:56 PM
  • Batching should help here.

    Are the image sizes uniform on average? The batch size cannot be lower than the largest single row, because Sync Fx will not split a row into separate batches.
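
    In Sync Framework, batching on the relational providers is controlled by MemoryDataCacheSize (in KB) and BatchingDirectory; per the point above, the cache size must be at least as large as the biggest single row. A two-tier sketch for brevity (in the N-tier case the same properties would be set on the provider inside the WCF service); connection strings and scope name are placeholders:

```csharp
using System.Data.SqlClient;
using Microsoft.Synchronization;
using Microsoft.Synchronization.Data.SqlServer;

// Sketch: enable batching on both providers so no single message has
// to hold the whole table's changes in memory at once.
var local  = new SqlSyncProvider("UpdateScope", new SqlConnection(localConnString));
var remote = new SqlSyncProvider("UpdateScope", new SqlConnection(remoteConnString));

// MemoryDataCacheSize is in KB and must be at least as large as the
// biggest single row, since a row is never split across batches.
local.MemoryDataCacheSize  = 10 * 1024;   // 10 MB per batch
remote.MemoryDataCacheSize = 10 * 1024;
local.BatchingDirectory  = @"C:\SyncBatches";
remote.BatchingDirectory = @"C:\SyncBatches";

var orchestrator = new SyncOrchestrator
{
    LocalProvider  = local,
    RemoteProvider = remote,
    Direction      = SyncDirectionOrder.DownloadAndUpload
};
orchestrator.Synchronize();
```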

    Wednesday, April 3, 2013 1:51 PM
    Moderator
  • Actually, the table contains different types of files, from small text files to large MSI files. The purpose of this table is to store application update files. Should I use batching only for the update scope, or for all defined scopes? (Besides the update scope, which contains this table, I have other scopes used to sync dictionaries, transactions, and so on.)
    Wednesday, April 3, 2013 2:33 PM
  • I use batching for all of my transfers; I have no idea how much data they will decide to transfer.
    Wednesday, April 3, 2013 5:26 PM
  • Batching resolved my problem. 

    Thanks

    Thursday, April 11, 2013 7:22 PM