Can multiple tables be included in a single batched sync session?

  • Question

  • Is it possible to group different tables together inside a single batched sync session? I am having a problem with a sync that I believe should be batched, but the sync operation is failing.

    My test configuration is as follows. The whole configuration is based on the WebAppDemo application for the sync framework. I am running my sync service as a console application on my local PC. I am also running my client application on my local PC, but it is connected to a different database. I am using WCF for communication between the client and service over an endpoint configured with NetTcpBinding.

    The symptom I am seeing is this: two smaller sync requests that did not require batching worked just fine. A third sync request that likely requires batching appears to be "hung". About 30,000 records' worth of data is supposed to be downloaded from the service to the client. The service believes it has gathered the data and returned it to the client. The client is waiting for the data to arrive, but it never does. Eventually, the WCF link times out and the sync fails.

    Here is how our sync "scopes" are currently configured. Our operational database consists of about 100 tables (most of them fairly small, but a few with several thousand records). I have grouped all of the tables into six "logical groups" of data that need to be synced together and ordered the tables within each group so that parent records are synced before child records. Each group is represented in the "scope_info" table as a single scope name and consists of one or more tables.
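
    For reference, here is roughly how each group gets provisioned as a scope. This is a sketch rather than our exact code: the connection string, scope name, and table names below are made up, and I am assuming the SqlSyncDescriptionBuilder / SqlSyncScopeProvisioning provisioning path here.

    Imports System.Data.SqlClient
    Imports Microsoft.Synchronization.Data
    Imports Microsoft.Synchronization.Data.SqlServer

    Module ScopeSetup
        Sub ProvisionGroup()
            Using serverConn As New SqlConnection( _
                    "Data Source=.;Initial Catalog=OpsDb;Integrated Security=True")
                serverConn.Open()

                ' One scope per logical group; parent tables are added before
                ' child tables so that changes apply in foreign-key order.
                Dim scopeDesc As New DbSyncScopeDescription("GroupThree")
                For Each tableName As String In _
                        New String() {"Customers", "Orders", "OrderLines"}
                    scopeDesc.Tables.Add( _
                        SqlSyncDescriptionBuilder.GetDescriptionForTable(tableName, serverConn))
                Next

                ' Records the scope in scope_info and creates the tracking objects.
                Dim provisioning As New SqlSyncScopeProvisioning(scopeDesc)
                provisioning.Apply(serverConn)
            End Using
        End Sub
    End Module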

    The first two sync groups I built synced just fine. However, they were fairly small groups and most of the tables had just a few records (one table in each group had about 500 records).

    The third group is the first one with a substantial amount of data in it. When this particular sync runs, it covers about 30,000 records in 14 tables (one table has 13,000 records). When I execute this sync, I wind up in a state where my sync service believes it has gathered up all 30,000 records and sent them to the client, and it is waiting for the client to respond with the next request (which would be an "End Session" in this case). On the client side, the call to GetChanges() never returns, and the WCF link eventually times out. So it looks like my data is "stuck" in WCF.

    I turned on WCF event tracing and examined the output. I can see that the channel has been marked faulty, but the only exceptions I see are SocketExceptions with the message “The socket connection was aborted” and a “NativeErrorCode” of 2746. I don’t see any other error indications on the interface. No exception is being thrown (or received) by either the client or the service code.

    When I watch this operation on the debugger, I can follow the code through this call:

    changesWrapper.ChangeBatch = Me.peerProvider.GetChangeBatch(batchSize, _
        destinationKnowledge, changesWrapper.DataRetriever)

    Dim context As DbSyncContext = CType(changesWrapper.DataRetriever, _
        DbSyncContext)

    If (context IsNot Nothing) AndAlso context.IsDataBatched Then
      ' Start building batch files
    End If

    The "IsDataBatched" flag is set to "False", so MSF has apparently determined that the data should not be batched. I don't know what kind of data object MSF uses to send data to and from clients (possibly DataTables?), but it seems doubtful to me that 30,000 records would fit inside a 5000 KB buffer.

    Right now, my suspicion is that this sync should actually be batched but is not, and as a result we may be trying to write too much data to the WCF message buffer (though I don't see a WCF error that shows this). I have set my MSF "batchSize" to "5000" (this is KB, correct?) and my WCF "maxBufferSize" to 10 MB, so if we were in fact batching the data, there should have been enough buffer space to hold it.
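
    For completeness, the client binding is configured along these lines. This is a sketch in code rather than our actual config file, and the module and function names are illustrative:

    Imports System
    Imports System.ServiceModel

    Module SyncBindingFactory
        Function CreateSyncBinding() As NetTcpBinding
            Dim binding As New NetTcpBinding()
            binding.MaxReceivedMessageSize = 10 * 1024 * 1024      ' 10 MB
            binding.MaxBufferSize = 10 * 1024 * 1024               ' buffered transfer: keep equal to the above
            binding.ReaderQuotas.MaxArrayLength = 10 * 1024 * 1024 ' each batch serializes as one large array
            binding.SendTimeout = TimeSpan.FromMinutes(30)
            binding.ReceiveTimeout = TimeSpan.FromMinutes(30)
            Return binding
        End Function
    End Module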

    When I did my prototype, I was able to batch data using these settings, but that was with a single table, so the record size was always the same. Here, I would be batching 14 tables with different record sizes and varying numbers of records per table. I actually plan to do something similar for the remaining sync groups, so I need to understand what I am doing wrong here so I can fix it.

    I would appreciate any suggestions or insights you can offer.

    Monday, June 14, 2010 4:15 PM

Answers

  • Multiple tables can be included in a sync batch.

    Have you set maxReceivedMessageSize and maxArrayLength to a larger size, such as 10 MB, and also set sendTimeout and receiveTimeout if needed? Also, you may try reducing the batchSize a bit, to 1-2 MB or even a few hundred KB, to make sure batching is used and the maximum transfer size over the wire stays under the limit. When a batch is serialized on the wire, the actual transfer size often grows, depending on the serialization format.
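
    As a rough sketch of what reducing the batch size looks like on the service-side provider (assuming a RelationalSyncProvider-derived provider; the spool directory is illustrative), note that MemoryDataCacheSize is specified in KB:

    Imports Microsoft.Synchronization.Data

    Module BatchingSetup
        ' Capping the in-memory data cache well below the WCF quotas forces
        ' the provider to batch; batch files are spooled to BatchingDirectory.
        Sub ConfigureBatching(ByVal provider As RelationalSyncProvider)
            provider.MemoryDataCacheSize = 1024           ' 1 MB per batch (value is in KB)
            provider.BatchingDirectory = "C:\SyncBatches" ' illustrative path
        End Sub
    End Module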

    Monday, June 14, 2010 6:11 PM
    Answerer

All replies

  • Thank you for your prompt response. My maxReceivedMessageSize and maxArrayLength are both set to 10 MB, and my sendTimeout and receiveTimeout are both set to 30 minutes on the service side and 60 minutes on the client side (I expect to adjust these a little later when I do the sync that will move the most data across the wire).

    I will change the batchSize to 1 MB and see what happens. I will let you know.


    Monday, June 14, 2010 7:36 PM
  • I dropped the batch size down to 1000 KB, and the system did in fact batch the data; the application is no longer "hung" at the same point. It is now "hung" someplace else :), but I think I can work on this one by myself for a while.

    Just FYI: the sync created four batch files occupying a total of 4,200 KB, which one would think would fit inside a 5000 KB buffer, but as you said, that may not serialize inside of 10 MB. The absence of any kind of error message was really throwing me for a loop.

    Now that I'm "unstuck" at that particular point, I can move on to the next.

    Again, my appreciation for your assistance.


    Monday, June 14, 2010 7:48 PM