File Sync to Azure using WCF

  • Question

  • Hi all,

    I have been looking at the code sample "Synchronizing Files to Windows Azure Storage".  I have implemented the code and it works great.  However, for my application, I don't want my clients to have a direct connection to Azure storage -- I need to route everything through a WCF service.  The workflow I need is:

    Client files -> FileSyncProvider -> SyncOrchestrator -> AzureBlobSyncProviderProxy -> WCF to Azure service -> AzureBlobSyncProvider -> Azure Blob Storage

    The parts not included in the sample "Synchronizing Files to Windows Azure Storage" are the AzureBlobSyncProviderProxy and the WCF service.  So that is where I need help -- can anyone give tips on constructing the WCF service and service contract, along with the proxy provider?  In trying to figure this out, I have already looked at "Sync101 with Remote Change Application over WCF", which doesn't help much because it implements a KnowledgeSyncProvider rather than the FullEnumerationSimpleSyncProvider needed in the Azure sample (the two have different methods).
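
    To make it concrete, the kind of service contract I imagine needing would look something like the sketch below.  The interface and operation names are just made up, not from either sample, and I assume non-serializable arguments like SyncCallbacks have to be created on the service side, with the change data retriever wrapped in something serializable to cross the wire:

```csharp
using System.ServiceModel;
using Microsoft.Synchronization;

// Hypothetical contract (IAzureBlobSyncService and the operation shapes are
// guesses, not from either sample). It mirrors the KnowledgeSyncProvider calls
// a destination-side proxy has to forward. Non-serializable arguments such as
// SyncCallbacks would be created on the service side.
[ServiceContract]
public interface IAzureBlobSyncService
{
    [OperationContract]
    void BeginSession();

    [OperationContract]
    SyncIdFormatGroup GetIdFormats();

    [OperationContract]
    void GetSyncBatchParameters(out uint batchSize, out SyncKnowledge knowledge);

    [OperationContract]
    ChangeBatch GetChangeBatch(uint batchSize, SyncKnowledge destinationKnowledge,
                               out object changeDataRetriever);

    [OperationContract]
    void ProcessChangeBatch(ConflictResolutionPolicy resolutionPolicy,
                            ChangeBatch sourceChanges, object changeDataRetriever);

    [OperationContract]
    void EndSession();
}
```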

    Monday, October 11, 2010 11:28 AM

Answers

  • I just wanted to close out this thread by saying "I GOT IT!!!".  It was hard to get the whole thing to work, but I finally managed it.  If you are looking to do this, I wish you luck - it is going to take some time.
    • Marked as answer by Brian_I Friday, October 29, 2010 7:14 PM
    Friday, October 29, 2010 7:14 PM

All replies

  • You are right, Simple Providers don't really work for this scenario.  

    Options I can think of:

    - Write a KnowledgeSyncProvider (KSP)

    - Put your WCF interface on the other side of the Simple Provider (e.g., the WCF interface exposes Create, Update, Delete, and GetAll operations, which your Simple Provider calls)

    - Create a dual sync session - you have a fake proxy locally which implements KSP and just passes those calls over WCF.  On your WCF service you have another fake KSP which acts as the source in a second sync session, with your Simple Provider as the destination.  The service KSP "maps" the calls from a destination provider to a source provider, e.g., your ProcessChangeBatch calls become GetChangeBatch calls on the service.
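
    A rough skeleton of that local fake proxy (all names here are hypothetical, and IAzureBlobSyncService stands for a WCF contract with matching operations - not shown):

```csharp
using System;
using Microsoft.Synchronization;

// Sketch of the local "fake proxy": a KnowledgeSyncProvider whose overrides do
// no local work and simply forward each call over a WCF channel. Callbacks and
// statistics are handled on the service side rather than passed over the wire.
public class AzureBlobSyncProviderProxy : KnowledgeSyncProvider
{
    private readonly IAzureBlobSyncService channel;  // hypothetical WCF contract

    public AzureBlobSyncProviderProxy(IAzureBlobSyncService channel)
    {
        this.channel = channel;
    }

    public override SyncIdFormatGroup IdFormats
    {
        get { return channel.GetIdFormats(); }
    }

    public override void BeginSession(SyncProviderPosition position,
        SyncSessionContext syncSessionContext)
    {
        channel.BeginSession();
    }

    public override void GetSyncBatchParameters(out uint batchSize,
        out SyncKnowledge knowledge)
    {
        channel.GetSyncBatchParameters(out batchSize, out knowledge);
    }

    public override ChangeBatch GetChangeBatch(uint batchSize,
        SyncKnowledge destinationKnowledge, out object changeDataRetriever)
    {
        return channel.GetChangeBatch(batchSize, destinationKnowledge,
            out changeDataRetriever);
    }

    public override void ProcessChangeBatch(ConflictResolutionPolicy resolutionPolicy,
        ChangeBatch sourceChanges, object changeDataRetriever,
        SyncCallbacks syncCallbacks, SyncSessionStatistics sessionStatistics)
    {
        channel.ProcessChangeBatch(resolutionPolicy, sourceChanges,
            changeDataRetriever);
    }

    public override void EndSession(SyncSessionContext syncSessionContext)
    {
        channel.EndSession();
    }

    // The full-enumeration pair would forward the same way; omitted here.
    public override FullEnumerationChangeBatch GetFullEnumerationChangeBatch(
        uint batchSize, SyncId lowerEnumerationBound,
        SyncKnowledge knowledgeForDataRetrieval, out object changeDataRetriever)
    {
        throw new NotImplementedException();
    }

    public override void ProcessFullEnumerationChangeBatch(
        ConflictResolutionPolicy resolutionPolicy,
        FullEnumerationChangeBatch sourceChanges, object changeDataRetriever,
        SyncCallbacks syncCallbacks, SyncSessionStatistics sessionStatistics)
    {
        throw new NotImplementedException();
    }
}
```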


    Admittedly, none of these solutions are that great.


    -Jesse

    Monday, October 11, 2010 8:27 PM
  • New Question: Any suggestions on building a KnowledgeSyncProvider for Azure Blob Storage?

    Jesse,

    I appreciate the response.  I spent most of the day reading through msdn and the code samples trying to decide on my best option.  You confirmed my suspicion that I should go ahead and abandon the simple provider approach (using FullEnumerationSimpleSyncProvider).  The more I thought about the RCA approach in "Sync101 with Remote Change Application over WCF", the more I liked it, so I think creating a KnowledgeSyncProvider and following the pattern in that sample is the best option.

    Setting up WCF with a KnowledgeSyncProvider for Azure blob storage wouldn't be too hard given the Sync101 sample already available.  Moreover, there is a lot of duplication in the code for the providers in "Sync101 with Remote Change Application over WCF" and "Synchronizing Files to Windows Azure Storage", so I am holding out hope that someone knows how to make a KnowledgeSyncProvider for Azure Blob Storage.

    I have to believe that this is a somewhat common scenario -- anyone with any concern about synchronizing their Azure Blob Storage with encryption and/or authentication will have to consider this.

    Monday, October 11, 2010 9:44 PM
  • Hi Brian,


    We don't have a sample for this, but like you mentioned, there is a lot of common code between those two samples - you essentially need to combine the two.  Basically what you need to do is rip out all the operations implemented in "MySimpleDataStore" in the RCA sample and replace them with the basic operations implemented for the BlobStorage Simple Provider.  You can adjust the metadata store fields as needed, and potentially store the metadata file as a blob in the blob store.
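
    Something like this, roughly - the blob-side operations that stand in for MySimpleDataStore (container and blob names are illustrative; this is the v1 StorageClient API):

```csharp
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

// Illustrative blob-side operations standing in for MySimpleDataStore.
public class BlobStore
{
    private readonly CloudBlobContainer container;

    public BlobStore(CloudStorageAccount account, string containerName)
    {
        container = account.CreateCloudBlobClient()
                           .GetContainerReference(containerName);
        container.CreateIfNotExist();
    }

    // Basic item operations the provider calls while applying changes.
    public void Upload(string name, string localPath)
    {
        container.GetBlobReference(name).UploadFile(localPath);
    }

    public void Download(string name, string localPath)
    {
        container.GetBlobReference(name).DownloadToFile(localPath);
    }

    public void Delete(string name)
    {
        container.GetBlobReference(name).DeleteIfExists();
    }

    // Keeping the metadata file itself as a blob: pull it down before the
    // sync session, push it back afterwards.
    public void DownloadMetadata(string localPath)
    {
        try
        {
            container.GetBlobReference("sync.metadata").DownloadToFile(localPath);
        }
        catch (StorageClientException)
        {
            // First sync - no metadata blob exists yet.
        }
    }

    public void UploadMetadata(string localPath)
    {
        container.GetBlobReference("sync.metadata").UploadFile(localPath);
    }
}
```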


    Make sense?

    -Jesse

    Wednesday, October 13, 2010 6:22 AM
  • Jesse,

    Again, I appreciate the response.  I understand exactly what you are saying.  In fact, I just did some testing (upload only from file directory to Azure Blob Storage), and it worked, so I am well on my way to finishing a KnowledgeSyncProvider for Azure Blob Storage :)
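
    For concreteness, the upload-only test wiring looked roughly like this (the directory path is illustrative, and the proxy argument is my client-side KnowledgeSyncProvider that talks to the service):

```csharp
using System;
using Microsoft.Synchronization;
using Microsoft.Synchronization.Files;

// Wiring for an upload-only test: local file provider as the source,
// the WCF-backed proxy as the destination.
class UploadTest
{
    static void Run(KnowledgeSyncProvider azureProxy)
    {
        SyncOrchestrator orchestrator = new SyncOrchestrator();
        orchestrator.LocalProvider = new FileSyncProvider(@"C:\data\to-sync");
        orchestrator.RemoteProvider = azureProxy;
        orchestrator.Direction = SyncDirectionOrder.Upload;  // local -> cloud only
        SyncOperationStatistics stats = orchestrator.Synchronize();
        Console.WriteLine("Changes uploaded: {0}", stats.UploadChangesApplied);
    }
}
```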

    I have one question regarding the metadata store.  I have been trying to decide whether to store it on the client computer or in the cloud.  I know SqlMetadataStore is backed by a SQL CE database -- any chance I could somehow modify that to use SQL Azure?

    Let me be clear with my question:  Is it possible to use a SQL Azure database as the metadata store for Azure Blob Storage?  If so (I suspect it is), how hard would it be (i.e. can I just point SqlMetadataStore to a different SQL database)?
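
    For reference, here is how I am opening the store today (the path is illustrative) -- it works against a local SQL CE database file, which is why I suspect this isn't just a connection-string change:

```csharp
using System.IO;
using Microsoft.Synchronization.MetadataStorage;

// SqlMetadataStore is opened/created against a local file path (SQL CE),
// not a database connection string.
class MetadataStoreSetup
{
    static SqlMetadataStore Open()
    {
        string path = @"C:\data\azureblob.metadata";  // illustrative path
        return File.Exists(path)
            ? SqlMetadataStore.OpenStore(path)
            : SqlMetadataStore.CreateStore(path);
    }
}
```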

    Wednesday, October 13, 2010 7:16 PM
  • Hi Brian,

    Any chance you might post at least the issues you had to deal with for file blob sync to Azure?  I've modified the version published on the Sync Framework blog and it works, but it seems like it could be a lot better.

    Thanks


    Peter Kellner http://peterkellner.net Microsoft MVP • ASPInsider
    Friday, October 29, 2010 7:16 PM
    I wrote a long reply yesterday, but it didn't get saved, and I don't really want to write it again.  Suffice it to say, there were a ton of problems combining Sync, WCF, and Azure - it isn't for the faint of heart.  I also spent about two weeks debugging the whole thing end to end (development for Azure is fairly different from deployment on Azure when all these pieces are in play - a lot of weird and completely misleading errors crop up).  In fact, today I just finished verifying that all the conflict handling scenarios are properly handled.  It just takes a lot of time.
    Wednesday, November 3, 2010 1:46 AM