N-Tier batching

  • Question

  • I have a problem with Sync Framework batching in an N-tier environment...

    I have seen the sample application WebSharingDemo, and there you propose to serialize the changes to a file, upload it, and then deserialize and apply them.

    1) Why mess with files when the ChangeBatch class is serializable and we can simply return it from the proxy service function? (Is its "native" serialization better than XmlSerializer?)

    2) I can't make this thing work. :-)

    I have created a 10-line code sample that is supposed to do the job, and it does (synchronization is successful), BUT I can't get access to the sync file it creates. How am I supposed to upload the file if the provider creates and then deletes it? The file is definitely created; I traced this with Process Monitor, but even with a breakpoint on every line of my code I can never stop at a point where the file exists. So how am I supposed to get the ChangeBatch in file form?

    Here is my small code:

    // Namespaces used below (add these using directives at the top of the file):
    // using System.Data.SqlServerCe;
    // using Microsoft.Synchronization;
    // using Microsoft.Synchronization.Data.SqlServerCe;

    SqlCeConnection connection1 = new SqlCeConnection("data source=server.sdf");
    SqlCeConnection connection2 = new SqlCeConnection("data source=client.sdf");

    SqlCeSyncProvider localProvider = new SqlCeSyncProvider("test", connection1, "sync");
    SqlCeSyncProvider remoteProvider = new SqlCeSyncProvider("test", connection2, "sync");

    // Enable batching on both providers and keep the spooled files on disk.
    remoteProvider.BatchingDirectory = @"C:\temp\";
    remoteProvider.CleanupBatchingDirectory = false;
    remoteProvider.MemoryDataCacheSize = 1;   // in KB: spool to disk once changes exceed 1 KB

    localProvider.BatchingDirectory = @"C:\temp\";
    localProvider.CleanupBatchingDirectory = false;
    localProvider.MemoryDataCacheSize = 1;

    localProvider.BeginSession(SyncProviderPosition.Local, null);
    remoteProvider.BeginSession(SyncProviderPosition.Remote, null);

    // Ask the destination for its knowledge and batch size, enumerate changes on the
    // source, then apply them on the destination.
    uint size;
    SyncKnowledge knowledge;
    object context;
    remoteProvider.GetSyncBatchParameters(out size, out knowledge);
    ChangeBatch batch = localProvider.GetChangeBatch(size, knowledge, out context);
    SyncSessionStatistics stats = new SyncSessionStatistics();
    remoteProvider.ProcessChangeBatch(ConflictResolutionPolicy.ApplicationDefined, batch, context, new SyncCallbacks(), stats);

    Friday, March 26, 2010 3:31 PM

Answers

  • AFAIK, it will not batch if the changes can fit in the cache, but it will still sync.
    • Marked as answer by Alex Burtsev Monday, August 2, 2010 5:32 AM
    Monday, March 29, 2010 8:18 PM

All replies

  • have you tried following the steps here? http://msdn.microsoft.com/en-us/library/dd918908(SQL.105).aspx

    For n-tier scenarios, the service/proxy code takes care of streaming the files. You just call Synchronize and it will take care of invoking the other steps (sessions, GetChangeBatch, ProcessChangeBatch, etc.).
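
    A minimal sketch of that call pattern, assuming a proxy class like the sample's RelationalProviderProxy (any KnowledgeSyncProvider-derived proxy that forwards calls to your WCF service would do; its constructor arguments here are only illustrative):

    SqlCeSyncProvider localProvider = new SqlCeSyncProvider("test", new SqlCeConnection("data source=client.sdf"), "sync");

    // Proxy that forwards GetChangeBatch/ProcessChangeBatch (and the batch files) to the service.
    KnowledgeSyncProvider remoteProxy = new RelationalProviderProxy("test");   // illustrative only

    SyncOrchestrator orchestrator = new SyncOrchestrator();
    orchestrator.LocalProvider = localProvider;
    orchestrator.RemoteProvider = remoteProxy;
    orchestrator.Direction = SyncDirectionOrder.UploadAndDownload;

    // Synchronize drives BeginSession / GetSyncBatchParameters / GetChangeBatch /
    // ProcessChangeBatch / EndSession on both providers for you.
    SyncOperationStatistics syncStats = orchestrator.Synchronize();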

    Saturday, March 27, 2010 4:58 AM
  • Of course I saw this article, and I have seen all the Sync Framework samples. And yes, I tried following them; my 10-line code sample is in fact a rip of that code. But this code doesn't suit me: I need secure synchronization with authorization, encryption of the batch files while they are stored on disk, and many other complex things.
    Saturday, March 27, 2010 11:29 AM
  • try listening for the BatchSpooled event.

    or if you use the sample code from http://msdn.microsoft.com/en-us/library/dd918908(SQL.105).aspx, you can take a look at the RelationalProviderProxy's GetChangeBatch and ProcessChangeBatch overrides where it downloads/uploads the file. And on the RelationalWebSyncService, you can again intercept the file in the UploadBatchFile and DownloadBatchFile operations.
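
    Hooking the event is a one-liner on the provider that enumerates the changes; a rough sketch (the DbBatchSpooledEventArgs property names below are from memory, so double-check them against the documentation):

    localProvider.BatchSpooled += (sender, e) =>
    {
        // Fires once for every batch file the provider spools to disk during GetChangeBatch.
        Console.WriteLine("Spooled batch {0} of {1}: {2}",
            e.CurrentBatchNumber, e.TotalBatchesSpooled, e.BatchFileName);
    };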

     

    Saturday, March 27, 2010 12:41 PM
  • I tried; the BatchSpooled event doesn't fire.

    I have seen the WebSharing sample code, and the proxy is just a wrapper; anyway, it calls the "native" SyncProvider GetChangeBatch():

    changesWrapper.ChangeBatch  = this.peerProvider.GetChangeBatch(batchSize, destinationKnowledge, out changesWrapper.DataRetriever);

     

    And for some reason, in my code the file is created and deleted during the GetChangeBatch() method, and I can't get access to it.

    Too much time has been lost on Sync Framework already; maybe I should implement my own sync system.

    Sunday, March 28, 2010 4:50 PM
  • I just copied your code and I was able to see the files in the remote provider's batching directory after the call to GetChangeBatch.

    I set different paths for the BatchingDirectory.

    Also, make sure you have enough changes for the batching to kick in. In my case, I made sure I had more than 1 KB worth of changes on the local provider.
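
    For a quick sanity check in code, you can list the folder right after GetChangeBatch in your snippet (needs System.IO; the spooled files end up under the provider's BatchingDirectory, possibly in a per-session subfolder, hence the recursive search):

    ChangeBatch batch = localProvider.GetChangeBatch(size, knowledge, out context);

    // Dump whatever the provider has spooled to disk at this point.
    foreach (string file in Directory.GetFiles(@"C:\temp\", "*", SearchOption.AllDirectories))
    {
        Console.WriteLine(file);
    }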

    Sunday, March 28, 2010 11:44 PM
  • Hmm, I thought it might happen because of
    MemoryDataCacheSize = 1 // so I need at least 1 KB of data to fill the memory buffer

    But I threw away that thought, because that would result in incorrect synchronization: sync would not happen if the data is smaller than the memory cache...

    Indeed, I was syncing small data, so the question is:

    What do I do when the data is smaller than MemoryDataCacheSize?

    Monday, March 29, 2010 12:01 PM
  • AFAIK, it will not batch if the changes can fit in the cache, but it will still sync; a sketch of handling both cases (batched vs. in-memory) follows below.
    • Marked as answer by Alex Burtsev Monday, August 2, 2010 5:32 AM
    Monday, March 29, 2010 8:18 PM
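
    To make that concrete: when the relational providers do batch, the object returned through the out parameter of GetChangeBatch (context in the original snippet) is a DbSyncContext, and that is where the batch file shows up. A rough sketch of handling both cases (property names from memory, so verify against the WebSharingAppDemo sample):

    object context;
    ChangeBatch batch = localProvider.GetChangeBatch(size, knowledge, out context);

    DbSyncContext dbContext = context as DbSyncContext;
    if (dbContext != null && dbContext.IsDataBatched)
    {
        // Changes were spooled to disk; this is the file you would upload to the service.
        Console.WriteLine("Batched changes in: " + dbContext.BatchFileName);
    }
    else
    {
        // Small change set: everything stayed in memory, so there is no batch file to upload.
        Console.WriteLine("Changes fit in the cache; no batch file was produced.");
    }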