Sync framework tracking table issue RRS feed

  • Question

    I am using Sync Framework for the first time.

    I have a table with 37 million records. When I call the Apply((System.Data.SqlClient.SqlConnection)provider.Connection) method of the SqlSyncScopeProvisioning class, it throws the following error: "This SqlTransaction has completed; it is no longer usable."

    Are there any restrictions on the number of records in the table? Or is there any other way to handle this scenario?

    Any help will be really appreciated.

    Tuesday, August 31, 2010 2:45 PM


All replies

  • you must be running into this same issue: http://social.microsoft.com/Forums/en-US/syncdevdiscussions/thread/5695c5da-c1bc-4c93-8ce8-d3366d18355e/

    i suspect the timeout is happening when populating the tracking table.

    if you're not yet using Sync Fx v2.1, you might want to upgrade to it. The SqlSyncProvider and SqlSyncScopeProvisioning classes have a new CommandTimeout property that you can set.
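
    as a rough sketch (assuming Sync Fx 2.1; the connection string, scope and table names below are placeholders, not from your setup):

    using System.Data.SqlClient;
    using Microsoft.Synchronization.Data;
    using Microsoft.Synchronization.Data.SqlServer;

    using (var serverConn = new SqlConnection(@"Data Source=.;Initial Catalog=SyncDB;Integrated Security=True"))
    {
        // describe the scope over the large base table
        var scopeDesc = new DbSyncScopeDescription("MyScope");
        scopeDesc.Tables.Add(SqlSyncDescriptionBuilder.GetDescriptionForTable("MyBigTable", serverConn));

        var provisioning = new SqlSyncScopeProvisioning(serverConn, scopeDesc);
        // raise the timeout (in seconds) so populating the tracking table
        // for a very large base table does not abort the provisioning transaction
        provisioning.CommandTimeout = 3600;
        provisioning.Apply();
    }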


    Tuesday, August 31, 2010 3:16 PM
  • Hi June

    We have a situation where we need to synchronize millions of records. We found that the Sync Framework takes quite a long time (days!) to complete this process. To overcome this, we devised a way where a service on the server side exports the data using BCP to table-specific files. The client downloads these files and imports the data on the client side using BCP. We are interested in one-way sync from the server to the client.

    We were using Sync Framework 2.0, and since it could not handle table schema changes, we implemented custom logic to manage them. We would typically drop the table on the client and import the data again using BCP. So far so good. We recently migrated to MS Sync Framework 2.1, thanks to an issue and its resolution provided by June T (http://social.microsoft.com/Forums/en-US/syncdevdiscussions/thread/00b5fb80-ee43-4a94-b401-76408187d22d?prof=required ). However, we have retained the custom logic to manage schema changes.

    We have run into a new issue now. I will try to explain the scenario as best as I can.

    At some point the data gets exported to a set of files on the server side. The client downloads and imports the data using BCP at some interval. After successfully importing the data, we fake the synchronization by clearing the dataset object of type DbSyncContext. We clear this because the data is already imported and we do not want the Sync Framework to start downloading the entire data again.

    During this interval there are DML operations on the server data. When the sync process next runs, it does not pick up the rows modified/added/deleted up to that point in time, since we faked the previous sync. In subsequent synchronizations, newer changes/additions/deletes are synched, but not the ones that were missed in the previous step.


    Wednesday, September 15, 2010 3:51 PM
  • have you tried sending a SQL CE database created using GenerateSnapshot to initialize your client database instead?
    Thursday, September 16, 2010 1:10 AM
  • Hi June

    Thank you again.

    It is very difficult to re-architect at this point in time. In the approach we have currently taken, can you kindly suggest a workaround where we can set the timestamp on the server to indicate that the data needs to be pulled down to the client?

    Thursday, September 16, 2010 10:32 AM
    unfortunately, SqlSyncProvider is not anchor-based, so you can't just set the metadata to a prior timestamp the way the offline providers work.

    you are not getting the changes initially because, as far as Sync Fx is concerned, the changes were synched; you just cleared the actual dataset.

    you can "touch" the data by doing dummy updates so their metadata is updated and they get detected as changed. however, this change will also be visible to other clients.
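
    for example (table/column names are placeholders), a no-op update works because it still fires the update trigger that maintains the tracking metadata:

    // set a column to itself: the data is unchanged, but the row's
    // tracking metadata gets bumped, so the next sync re-sends the row.
    // note this re-sends the rows to ALL clients, not just one.
    using (var cmd = new SqlCommand("UPDATE MyBigTable SET SomeColumn = SomeColumn", serverConn))
    {
        cmd.ExecuteNonQuery();
    }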

    the other suggestion, using a snapshot, already has its metadata updated for the initial data, so that when it syncs, existing data no longer needs to be downloaded.

    Thursday, September 16, 2010 2:01 PM
  • Thanks,

    Is there any example or sample project using snapshot?

    Monday, September 20, 2010 10:56 AM
  • you'll find some sample code in the documentation. just search for GenerateSnapshot.

    the steps would be :

    1. provision server scope

    2. provision a SQL CE client

    3. generate snapshot from SQL Ce client

    4. provision client

    5. synchronize new client with snapshot

    6. synchronize new client with server  // data will not be downloaded since the client got the data from snapshot

    here's a quick code view from the docs:

    // Create a snapshot from the SQL Server Compact database, which will be used to
    // initialize a second Compact database. Again, this database could be provisioned
    // by retrieving scope information from another database, but we want to 
    // demonstrate the use of snapshots, which provide a convenient deployment
    // mechanism for Compact databases.
    SqlCeSyncStoreSnapshotInitialization syncStoreSnapshot = new SqlCeSyncStoreSnapshotInitialization("Sync");
    syncStoreSnapshot.GenerateSnapshot(clientSqlCe1Conn, "SyncSampleClient2.sdf");
    // The new SQL Server Compact client synchronizes with the server, but
    // no data is downloaded because the snapshot already contains 
    // all of the data from the first Compact database.
    syncOrchestrator = new SampleSyncOrchestrator(
      new SqlSyncProvider("filtered_customer", serverConn, null, "Sync"),
      new SqlCeSyncProvider("filtered_customer", clientSqlCe2Conn, "Sync"));
    syncStats = syncOrchestrator.Synchronize();
    syncOrchestrator.DisplayStats(syncStats, "initial");
    Monday, September 20, 2010 2:11 PM
  • Thank you for the timely reply. I really appreciate that.

    We are using SQL Server 2008 on both the server and client machines; is CE compatible in this case?

    In this link http://msdn.microsoft.com/en-us/library/dd937880(v=SQL.110).aspx the statement

    "After one client database has been initialized by using full initialization" appears. What does this full initialization mean?

    Since the database on the server is a minimum of 8 GB, is it meaningful to use a compact database?

    And you have mentioned in your post to "provision a SQL CE client". How do we do this?

    As per my understanding,

    We should first provision the server database, then create a snapshot of it using the SqlCeSyncStoreSnapshotInitialization.GenerateSnapshot() method, which creates a CE database at the specified path. Then we provision the client DB and sync the client with the snapshot. Is that correct?

    Tuesday, September 21, 2010 7:07 AM
  • SQL CE can sync with SQL Server, SQL Express, or any other compatible database, for as long as you have a provider.

    full initialization means creating all the table schemas and sync objects, and loading the data.

    if you have installed Sync Fx 2.1, there is a tutorial, look for Walkthrough: Provisioning a SQL Compact Client and Walkthrough: Provisioning another SQL Compact Client using Snapshot initialization.

    as i have mentioned in the steps above, you need to provision a SQL CE client first before you can generate a snapshot. The snapshot is generated from an existing SDF file.

    to clarify further:

    1. provision server scope in your server

    2. provision a SQL CE client - this will generate an SDF file  (see Walkthrough: Provisioning a SQL Compact Client)

    3. generate snapshot from SQL CE client - this is generated from the SDF in step 2 (Walkthrough: Provisioning another SQL Compact Client using Snapshot initialization)

    4. provision client - this is your SQLServer on the client. (see Walkthrough: Provisioning a SQL Express Client, this works with SQL Server as well)

    5. synchronize new client with snapshot - synchronize the client in step 4 with the SDF in step 3; this loads the data from the SDF snapshot into your client's SQL Server

    6. synchronize new client with server - the client in step 4 syncs with the server in step 1; this loads all incremental changes since the snapshot was generated.



    Tuesday, September 21, 2010 7:50 AM
  • But the compact db has a size limitation, correct?

    Since the database on the server is a minimum of 8 GB, is it meaningful to use a compact database?


    Tuesday, September 21, 2010 10:32 AM
  • i think i have already asked you in your other thread if you're applying filters for your client copies. if you're not applying any filter and will be synching the entire 8gb, just follow the other approach: provision the server, back up the database, restore the backup on the client, and do a post-restore fixup.
    Tuesday, September 21, 2010 2:29 PM
  • Hi June

    I really appreciate your advice and timely response. We had success using the Sync Framework.

    We have installed the Sync framework 2.1.

    However, when we run the incremental synchronization we are getting the following error on the server.

    Retrieving the COM class factory for component with CLSID {EC413D66-6221-4EBB-AC55-4900FB321011} failed due to the following error: 80040154.

    Source: Microsoft.Synchronization
    Method: DoOneWaySyncHelper
    Stack Trace: at Microsoft.Synchronization.KnowledgeSyncOrchestrator.DoOneWaySyncHelper(SyncIdFormatGroup sourceIdFormats, SyncIdFormatGroup destinationIdFormats, KnowledgeSyncProviderConfiguration destinationConfiguration, SyncCallbacks DestinationCallbacks, ISyncProvider sourceProxy, ISyncProvider destinationProxy, ChangeDataAdapter callbackChangeDataAdapter, SyncDataConverter conflictDataConverter, Int32& changesApplied, Int32& changesFailed)
       at Microsoft.Synchronization.KnowledgeSyncOrchestrator.DoOneWayKnowledgeSync(SyncDataConverter sourceConverter, SyncDataConverter destinationConverter, SyncProvider sourceProvider, SyncProvider destinationProvider, Int32& changesApplied, Int32& changesFailed)
       at Microsoft.Synchronization.KnowledgeSyncOrchestrator.Synchronize()
       at Microsoft.Synchronization.SyncOrchestrator.Synchronize()
       at Continuum.Local.Client.SynchronizationHelper.SynchronizeProviders(KnowledgeSyncProvider localProvider, KnowledgeSyncProvider remoteProvider)
       at Continuum.Local.Client.SyncEngine.SynchronizeTableScope(SyncServiceProviderProxy destinationProxy, SqlSyncProvider clientSyncProvider, Boolean Provision)


    Any help would be really appreciated.

    Tuesday, November 2, 2010 5:50 AM
  • is your server 32-bit or 64-bit? please make sure that you ran the matching Sync Fx installer.

    Tuesday, November 2, 2010 6:05 AM
  • Our server is a 64-bit machine.
    Tuesday, November 2, 2010 9:32 AM
  • have you installed the corresponding 64-bit Sync Fx bits? what was the target platform for your project in VS? just confirm that they match.
    Tuesday, November 2, 2010 10:03 AM
  • I have a similar situation involving SQL Server 2008 Express on the client syncing with a SQL Server 2008 db, except I am using a WCF service as the provider for the main source db. I did the following:

    1. Created a database on the server that matched in schema the db I will have on the client.  This is the baseline db for the client.
    2. Populated the db, using SSIS, with data from the source, which is also on the same server.
    3. After population, I backed up and restored the baseline database on the client.
    4. Then I applied provisioning on both and the tracking tables were made.  I had the timeout set to 0.
    5. The source db's provider is a WCF service with batching.

    The issue I am seeing is that the WCF service times out because of the millions of records on the source db during the enumeration of one table. I know the issue is the enumeration taking too long on the main server, which causes WCF to time out. What can I do to fix this?

    Friday, November 5, 2010 10:04 AM
  • have you tried extending your WCF timeout then?

    also, have you tried these steps instead?

    1. Create your database
    2. Populate using SSIS
    3. Provision Scope
    4. Back up DB
    5. Restore to Client
    6. Perform Post Restore Fixup
    7. Sync.
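
    step 6 is just one call against the restored client database; a sketch (the connection string is a placeholder):

    // run on the restored copy so it gets its own replica identity
    // instead of carrying over the server's sync metadata
    using (var clientConn = new SqlConnection(clientConnString))
    {
        var databaseRestore = new SqlSyncStoreRestore(clientConn);
        databaseRestore.PerformPostRestoreFixup();
    }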

    Friday, November 5, 2010 3:09 PM
  • I have done everything up to step 6, but one of my tables, which has about 6 million+ records, is causing a timeout on the WCF side. Do you think I can batch the number of rows coming back using the SelectNewAnchorCommand method (http://msdn.microsoft.com/en-us/library/bb902828.aspx)?
    Friday, November 5, 2010 3:29 PM
  • which provider are you using?
    Friday, November 5, 2010 3:44 PM
  • I was originally using the SqlSyncProvider but am in the process of changing the provider to DBSyncProvider.
    Friday, November 5, 2010 3:45 PM
  • Correction: I meant DbServerSyncProvider 
    Friday, November 5, 2010 3:48 PM
  • Hi June

    I am getting the following error when the sync process ends.

    An unsecured or incorrectly secured fault was received from the other party. See the inner FaultException for the fault code and detail.

    The message could not be processed. This is most likely because the action 'http://tempuri.org/ISyncWebService/EndSession' is incorrect or because the message contains an invalid or expired security context token or because there is a mismatch between bindings. The security context token would be invalid if the service aborted the channel due to inactivity. To prevent the service from aborting idle sessions prematurely increase the Receive timeout on the service endpoint's binding.

    Can you kindly tell me what the possible reason may be?

    Also, the files which get created under the BatchDirectory ("websynch_XXXXXXXXXXXXXXXXXXXX") are sometimes not getting deleted, and this is blowing up the space on the client machine.

    Your help will be really appreciated.




    Tuesday, November 9, 2010 12:01 PM
  • looks to me like more of a WCF exception than Sync Fx.  have you tried the suggestion in the exception itself? the temp files are probably not getting deleted/cleaned up because of the exception you're getting.
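
    for the idle-session abort specifically, lengthening the binding timeouts usually helps; e.g. if the binding is built in code (binding type and values are just examples):

    // raise the receive timeout so idle spells during a long sync
    // don't make the service abort the security context
    var binding = new WSHttpBinding();
    binding.ReceiveTimeout = TimeSpan.FromMinutes(30);
    binding.SendTimeout = TimeSpan.FromMinutes(30);

    // the same settings can be made in config via
    // <binding receiveTimeout="00:30:00" sendTimeout="00:30:00" />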
    Tuesday, November 9, 2010 3:48 PM
  • Hi June

    I am getting the following error

    Error in GetChanges() method:

    Cannot access the spooling directory 'C:\Windows\TEMP\sync_f93f899775544278b6510f95968856cae829e01cc20b41dc8508fb9d5caf4de7' for synchronization

    Wednesday, November 10, 2010 12:34 PM
  • have you checked for permissions on the Temp folder?
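
    also note that the batching/spooling directory has to be set on the provider on each side; a provider you don't set it on falls back to the machine's Temp folder. a sketch (scope name and path are placeholders):

    // set this on BOTH the client-side provider and the one inside
    // the WCF service; an unset side spools batches under %TEMP%
    var provider = new SqlSyncProvider("MyScope", conn);
    provider.BatchingDirectory = @"C:\MyApp\SyncBatches";
    provider.MemoryDataCacheSize = 5000; // batch size in KB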

    Wednesday, November 10, 2010 1:50 PM
  • First of all, I don't understand why the batch files are getting created under Temp when I am setting the batch directory path to the application folder.


    Wednesday, November 10, 2010 2:32 PM
  • Hi JuneT,

    I am facing the same issue. I did all the steps you mentioned except step 6: Perform Post Restore Fixup. Can you please tell me what this Post Restore Fixup means? I don't know what needs to be done in this step.

    Please help.




    Wednesday, November 17, 2010 6:17 AM
  • Hello Gladson,

    Did you find a solution to your question? I am having a similar issue and I am not sure what is going on. From some of the other items that I have found, it sounds to be an issue with the time difference between the client and server being > 10 minutes. That does make sense for us, as we have people from other time zones syncing to an east coast server. I have not found any way to fix this yet, and have to assume that you can use sync across time zones.


    Please let me know if you figure anything out with this. Any help would be greatly appreciated.



    Friday, December 3, 2010 3:55 PM
  • No Zeth, I am still facing the problem. The problem is that instead of fetching the incremental changes, the sync framework is fetching all the records every time, and because of this we are facing a lot of other issues. In our local environment everything seems to be working fine. I will really think along your lines about the timezone and keep you updated. If you find any solution, kindly let me know.
    Tuesday, December 14, 2010 9:37 AM
  • But why does the sync framework bring all the records when it has to get only the incremental changes?
    Tuesday, December 14, 2010 12:15 PM
  • is the sync failing? if it is, then it simply retries again. are you seeing conflicts in ApplyChangesFailed? it could be that a conflict is detected and it's being retried on the next sync.


    Tuesday, December 14, 2010 3:16 PM
  • Sync is not failing, but I am seeing conflicts in ApplyChangesFailed, and if there is no error I am doing an ApplyAction.RetryWithForceWrite.

    As per my understanding, conflicts occur after the data is downloaded from the server onto the client, while updating the client database. At the point where the framework gets the data, it is fetching all the records.

    In our case the Sync Framework fetches the incremental changes successfully until it fails for a particular table (one with 25 lakh, i.e. 2.5 million, records); after this failure, for the subsequent tables it fetches all the records instead of incremental changes.

    Is there anything which I have to do to overcome this problem?

    Wednesday, December 15, 2010 5:45 AM
  • Hi June

    I have a problem with schema changes. When the schema of a table on the server is updated/deleted, the same schema change is reflected on the client. However, if a new column is added, I am getting the following error: SQL exception "Invalid Column Name", procedure tableName_bulkinsert.

    Wednesday, April 6, 2011 12:25 PM
  • did you provision a new scope for the new schema? please note that you can't have different scopes with different column counts against the same base table.

    for example, if you first provision a scope with 5 columns, these same columns will be reflected in the selectchanges SP, the bulk table type, and the insert/update stored procs, including the bulk SPs.

    now, when you provision a second scope against the same table using SetCreateProceduresForAdditionalScopeDefault with the new column included (say the scope now includes 6 columns), an additional selectchanges SP will be created, but the bulk type and the insert/update stored procs (including the bulk ones) will not be updated to include the new column.

    multiple scopes defined against the same table actually share the same bulk type and insert/update SPs.
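
    to illustrate, provisioning the additional scope looks roughly like this (scope/table names are placeholders); only the selectchanges SP is generated per scope, the rest is shared:

    var scopeDesc = new DbSyncScopeDescription("MyScope_v2");
    scopeDesc.Tables.Add(SqlSyncDescriptionBuilder.GetDescriptionForTable("MyTable", serverConn));

    var provisioning = new SqlSyncScopeProvisioning(serverConn, scopeDesc);
    // the table already has tracking objects from the first scope
    provisioning.SetCreateTableDefault(DbSyncCreationOption.Skip);
    provisioning.SetCreateTrackingTableDefault(DbSyncCreationOption.Skip);
    provisioning.SetCreateTriggersDefault(DbSyncCreationOption.Skip);
    // creates a new scope-specific selectchanges SP, but the shared bulk
    // type and insert/update SPs are NOT regenerated with new columns
    provisioning.SetCreateProceduresForAdditionalScopeDefault(DbSyncCreationOption.Create);
    provisioning.Apply();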

    Wednesday, April 6, 2011 3:07 PM
  • Thanks June, for your valuable inputs.

    We have implemented data syncing successfully, and it works fine for some clients; for the rest, we are facing a problem syncing one particular table which is quite large. Every other table in the database syncs without any errors. We enabled tracing on the server and found some errors:

    - Type 'Microsoft.Synchronization.Data.DbSyncException' with data contract name 'DbSyncException:http://schemas.datacontract.org/2004/07/Microsoft.Synchronization.Data' is not expected. Add any types not known statically to the list of known types - for example, by using the KnownTypeAttribute attribute or by adding them to the list of known types passed to DataContractSerializer.

    - The communication object, System.ServiceModel.Channels.ReplyChannel, cannot be used for communication because it has been Aborted.

    - There was no channel that could accept the message with action 'http://tempuri.org/ISyncWebService/BeginSession'.

    - The communication object, System.ServiceModel.Channels.TransportReplyChannelAcceptor+TransportReplyChannel, cannot be used for communication because it has been Aborted.

      Any help would be really appreciated.

    Thursday, April 14, 2011 6:44 AM