SQL Express or CE database on the client and custom provider on the server - how to get it working?

  • Question

  • Hi,

    I'm trying to get this proof of concept going, but I wonder whether it is feasible:
    On the client (a laptop running Windows XP/Vista/7) we want a SQL database (CE or Express) that we synchronize with several data sources on the server. Since some of these data sources are not SQL Server and I'm not allowed to access the databases directly, I need to build a custom provider for the server.

    My initial architecture was to build a single custom provider that runs only on the server, aggregates all the required data there, and passes it back to the SqlSyncProvider on the client (the client using the N-tier architecture and a ProviderProxy).

    However:

    1) The SqlMetadataStore cannot be used as the metadata store on the server because it does not support the IdFormats of the SqlSyncProvider. Is this correct, or am I missing something? I get the following exception when calling the InitializeReplicaMetadata method.

    System.ArgumentOutOfRangeException was unhandled
      Message="Max length cannot exceed 8000 bytes.\r\nParameter name: idFormats"
      Source="Microsoft.Synchronization"
      ParamName="idFormats"

    So I'll need to implement a custom metadata store, right?
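For reference, the formats that trigger the exception can be inspected on the provider itself. A minimal sketch (scope name and connection string are placeholders, and I'm assuming the usual SqlSyncProvider constructor here):

```csharp
using System;
using System.Data.SqlClient;
using Microsoft.Synchronization;
using Microsoft.Synchronization.Data.SqlServer;

class IdFormatDump
{
    static void Main()
    {
        // Placeholder connection string and scope name.
        using (var conn = new SqlConnection("Data Source=...;Initial Catalog=...;Integrated Security=True"))
        {
            var provider = new SqlSyncProvider("MyScope", conn);

            // SqlMetadataStore caps variable-length id formats at 8000 bytes,
            // which is what the ArgumentOutOfRangeException above complains about.
            SyncIdFormatGroup formats = provider.IdFormats;
            Console.WriteLine("ItemId:    variable={0}, length={1}",
                formats.ItemIdFormat.IsVariableLength, formats.ItemIdFormat.Length);
            Console.WriteLine("ReplicaId: variable={0}, length={1}",
                formats.ReplicaIdFormat.IsVariableLength, formats.ReplicaIdFormat.Length);
        }
    }
}
```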

    2) The GetChangeBatch/ProcessChangeBatch methods of the SqlSyncProvider use an object of type DbSyncContext as the changeDataRetriever object.
    I assume this is the object I need to populate, or use, in my custom provider to be able to interoperate with the SqlSyncProvider.

    3) Since one of the data stores is a SQL Server 2008, would I be able to delegate some of the work to the SqlSyncProvider (provided I call it over a web service)?
    I imagine I will need to modify the ChangeBatch object, the changeDataRetriever, and the sessionStatistics to contain only the data that is relevant to that data store.

    4) Is an architecture where I have a provider for each data store and make the client sync its data with each specific provider (in this case executing the 'Sync' three times) a better choice?
     If so, would it be possible to do the synchronizations in parallel, or do they have to be performed sequentially?
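To make point 4 concrete, this is roughly what I mean by three separate sync sessions run in parallel. Provider construction is elided; CreateSessionFor is a hypothetical factory that would pair the client provider with the custom provider for one data store, and this assumes the three scopes do not overlap:

```csharp
using System;
using System.Collections.Generic;
using System.Threading;
using Microsoft.Synchronization;

class ParallelSyncSketch
{
    // Hypothetical factory: builds the client/server provider pair for one
    // data store and wraps them in an orchestrator.
    static SyncOrchestrator CreateSessionFor(string dataStore)
    {
        return new SyncOrchestrator
        {
            // LocalProvider / RemoteProvider would be set here to the client
            // provider and the (proxied) custom provider for this data store.
            Direction = SyncDirectionOrder.DownloadAndUpload
        };
    }

    static void Main()
    {
        var threads = new List<Thread>();
        foreach (var store in new[] { "StoreA", "StoreB", "StoreC" })
        {
            string s = store;
            var t = new Thread(() =>
            {
                // Each session is independent, so nothing prevents them from
                // running concurrently as long as the data does not overlap.
                SyncOperationStatistics stats = CreateSessionFor(s).Synchronize();
                Console.WriteLine("{0}: {1} up / {2} down",
                    s, stats.UploadChangesApplied, stats.DownloadChangesApplied);
            });
            t.Start();
            threads.Add(t);
        }
        threads.ForEach(t => t.Join());
    }
}
```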

    Thanks for your comments,
    Rudi

    Thursday, December 17, 2009 12:18 PM

Answers

  • I have basically abandoned the route of writing a standard custom provider, as it does not seem feasible to make it work with a database collaboration provider. I can't figure out how the knowledge is being built up or used, and the database holds all of the metadata per table and per scope, which makes it just a bit too complex.

    I'm currently hacking around with the DbSyncProvider to attempt to get my scenario working. I'm basically configuring it with my own set of DbXxx classes to get control of the commands the provider wants to execute. The implementations of XxxDbCommand.ExecuteNonQuery and ExecuteDbDataReader then call web services etc. to select/insert/update/delete the required data.
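The interception idea looks roughly like this. ISyncDataService is a hypothetical contract standing in for whatever web services front the real data stores; an inner SqlCommand is borrowed purely for its parameter plumbing:

```csharp
using System.Data;
using System.Data.Common;
using System.Data.SqlClient;

// Hypothetical service contract for the server-side data stores.
public interface ISyncDataService
{
    DataTable SelectChanges(string commandText, DbParameterCollection parameters);
    int ApplyChange(string commandText, DbParameterCollection parameters);
}

// A DbCommand whose Execute* methods call a web service instead of a
// database. The sync provider only sees a DbCommand, so it is none the wiser.
public class WebServiceSyncCommand : DbCommand
{
    private readonly ISyncDataService _service;
    private readonly SqlCommand _inner = new SqlCommand(); // parameter plumbing only

    public WebServiceSyncCommand(ISyncDataService service)
    {
        _service = service;
    }

    // Select-changes commands land here; answer from the service and hand
    // the rows back as a DbDataReader.
    protected override DbDataReader ExecuteDbDataReader(CommandBehavior behavior)
    {
        return _service.SelectChanges(CommandText, _inner.Parameters).CreateDataReader();
    }

    // Insert/update/delete commands land here.
    public override int ExecuteNonQuery()
    {
        return _service.ApplyChange(CommandText, _inner.Parameters);
    }

    public override object ExecuteScalar()
    {
        return ExecuteNonQuery();
    }

    // Remaining abstract members are plumbing.
    public override string CommandText { get; set; }
    public override int CommandTimeout { get; set; }
    public override CommandType CommandType { get; set; }
    public override bool DesignTimeVisible { get; set; }
    public override UpdateRowSource UpdatedRowSource { get; set; }
    protected override DbParameterCollection DbParameterCollection { get { return _inner.Parameters; } }
    protected override DbConnection DbConnection { get; set; }
    protected override DbTransaction DbTransaction { get; set; }
    public override void Cancel() { }
    public override void Prepare() { }
    protected override DbParameter CreateDbParameter() { return _inner.CreateParameter(); }
}
```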

    • Marked as answer by Rudi - Euricom Wednesday, January 20, 2010 7:46 AM
    Monday, January 11, 2010 1:38 PM

All replies

  • Rudi,

    Since you have disparate data sources on the server, could you consolidate that data into a SQL Server database? Then you could sync your client with this SQL Server using the out-of-box SQL providers instead of writing one yourself.
    It will also depend on the data that is in these separate sources: whether they collide/conflict with each other, or whether it is just a straight port of the data, albeit with some transformations etc.
    This posting is provided AS IS with no warranties, and confers no rights
    Thursday, December 24, 2009 6:33 PM
  • Hi,

    Consolidating the data sources into a single SQL Server database is out of the question. This is existing data, from existing applications.

    As the data does not collide, I should be able to separate the synchronizations, i.e. use point 4) from my first post. As I understand the architecture of MSF right now, doing the synchronizations in parallel should then be possible.

    I'm currently going with a custom provider, based on the DbServerSyncProvider, for each data source.
    The offline scenario seems to be a better match for what we need, and basing the custom providers on the DbServerSyncProvider also makes the 'meta-data' easier to manage: in this case I only need to store some extra timestamps in the database. I could even use the SQL Server 2008 Change Tracking feature with this architecture for some of the data sources.
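The timestamp-anchor setup I mean looks roughly like this for a single table. Table and column names are made up; the pattern is the standard offline (DbServerSyncProvider/SyncAdapter) configuration, with the anchor taken from the server's rowversion:

```csharp
using System.Data;
using System.Data.SqlClient;
using Microsoft.Synchronization.Data;
using Microsoft.Synchronization.Data.Server;

class OrdersProviderFactory
{
    public static DbServerSyncProvider Create(string connectionString)
    {
        var provider = new DbServerSyncProvider { Connection = new SqlConnection(connectionString) };

        // New anchor: the highest committed rowversion on the server.
        var anchorCmd = new SqlCommand(
            "SELECT @" + SyncSession.SyncNewReceivedAnchor + " = MIN_ACTIVE_ROWVERSION() - 1");
        anchorCmd.Parameters.Add("@" + SyncSession.SyncNewReceivedAnchor, SqlDbType.Timestamp)
                 .Direction = ParameterDirection.Output;
        provider.SelectNewAnchorCommand = anchorCmd;

        // One adapter per table; only the incremental-inserts command is shown,
        // the update/delete commands follow the same shape.
        var adapter = new SyncAdapter("Orders");
        var selInserts = new SqlCommand(
            "SELECT OrderId, Amount FROM Orders " +
            "WHERE CreatedTs > @" + SyncSession.SyncLastReceivedAnchor +
            "  AND CreatedTs <= @" + SyncSession.SyncNewReceivedAnchor);
        selInserts.Parameters.Add("@" + SyncSession.SyncLastReceivedAnchor, SqlDbType.Timestamp);
        selInserts.Parameters.Add("@" + SyncSession.SyncNewReceivedAnchor, SqlDbType.Timestamp);
        adapter.SelectIncrementalInsertsCommand = selInserts;
        provider.SyncAdapters.Add(adapter);

        return provider;
    }
}
```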

    I have noticed, however, that MSF casts some objects directly to the DbXxx abstract classes rather than to the IDbXxx interfaces (e.g. DbServerSyncProvider.SetSessionParameters casts IDbCommand.Parameters to DbParameter). This has made my task more difficult, as I otherwise have no need to derive from the DbXxx classes, and I need to serialize these objects.

    Monday, December 28, 2009 1:05 PM
  • Hello,
    Did you run into any blocking issues while writing your custom provider?
    Friday, January 8, 2010 11:07 PM