Batching with Datetime as the tracking type?

  • Question

  • Is there a demo for writing the batching code when you have a datetime as the tracking type? I can only find code examples where they use timestamp. The sync designer added datetime for the tracking columns.

    I need to see how to implement the stored procedure and the SqlCommand in the DbServerSyncProvider class.

     

    /Christer

    • Moved by Hengzhe Li Friday, April 22, 2011 2:49 AM (From:SyncFx - Microsoft Sync Framework Database Providers [ReadOnly])
    Wednesday, September 24, 2008 9:18 PM

All replies

  • How did you go with batching? I still cannot make the thing work!

     

    http://forums.microsoft.com/Forums/ShowPost.aspx?PostID=3856715&SiteID=1

     

    Thursday, October 9, 2008 2:02 AM
  • Hi, just wondering the same thing. I'm trying to implement batching using datetime in my tracking columns (SQL Server 2005 back-end). However, all the examples that I've seen use timestamps (e.g. http://msdn.microsoft.com/en-us/library/bb902828.aspx).

    E.g. code copied from the linked MSDN article:
    CREATE PROCEDURE usp_GetNewBatchAnchor (
        @sync_last_received_anchor timestamp,
        @sync_batch_size bigint,
        @sync_max_received_anchor timestamp out,
        @sync_new_received_anchor timestamp out,
        @sync_batch_count int output)
    AS
        -- Set a default batch size if a valid one is not passed in.
        IF @sync_batch_size IS NULL OR @sync_batch_size <= 0
            SET @sync_batch_size = 1000

        -- Before selecting the first batch of changes,
        -- set the maximum anchor value for this synchronization session.
        -- After the first time that this procedure is called,
        -- Synchronization Services passes a value for @sync_max_received_anchor
        -- to the procedure. Batches of changes are synchronized until this
        -- value is reached.
        IF @sync_max_received_anchor IS NULL
            SELECT @sync_max_received_anchor = MIN_ACTIVE_ROWVERSION() - 1

        -- If this is the first synchronization session for a database,
        -- get the lowest timestamp value from the tables. By default,
        -- Synchronization Services uses a value of 0 for @sync_last_received_anchor
        -- on the first synchronization. If you do not set @sync_last_received_anchor,
        -- this can cause empty batches to be downloaded until the lowest
        -- timestamp value is reached.
        IF @sync_last_received_anchor IS NULL OR @sync_last_received_anchor = 0
        BEGIN
            SELECT @sync_last_received_anchor = MIN(TimestampCol) FROM (
                SELECT MIN(UpdateTimestamp) AS TimestampCol FROM Sales.Customer
                UNION
                SELECT MIN(InsertTimestamp) AS TimestampCol FROM Sales.Customer
                UNION
                SELECT MIN(UpdateTimestamp) AS TimestampCol FROM Sales.OrderHeader
                UNION
                SELECT MIN(InsertTimestamp) AS TimestampCol FROM Sales.OrderHeader
            ) MinTimestamp

            SET @sync_new_received_anchor = @sync_last_received_anchor + @sync_batch_size

            -- Determine how many batches are required during the initial synchronization.
            IF @sync_batch_count <= 0
                SET @sync_batch_count = ((@sync_max_received_anchor / @sync_batch_size) - (@sync_last_received_anchor / @sync_batch_size))
        END
        ELSE
        BEGIN
            SET @sync_new_received_anchor = @sync_last_received_anchor + @sync_batch_size

            -- Determine how many batches are required during subsequent synchronizations.
            IF @sync_batch_count <= 0
                SET @sync_batch_count = ((@sync_max_received_anchor / @sync_batch_size) - (@sync_new_received_anchor / @sync_batch_size)) + 1
        END

        -- Check whether this is the last batch.
        IF @sync_new_received_anchor >= @sync_max_received_anchor
        BEGIN
            SET @sync_new_received_anchor = @sync_max_received_anchor
            IF @sync_batch_count <= 0
                SET @sync_batch_count = 1
        END
    GO
    

    Is there any equivalent code available using datetime instead of timestamp? I've got a mammoth web app and a mobile client that both sit on top of the same database, so I want to avoid changing data types in the underlying tables if at all possible (it has taken me a lot of late nights to get to this stage, and I don't fancy undoing weeks of work if I can avoid it).

    The above code treats the timestamp as an integer and adds the batch size to it, which obviously can't be done the same way with datetime.
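    Since a datetime anchor can't be advanced by integer addition, one alternative is to batch by a fixed time window with DATEADD/DATEDIFF instead of by row count. Below is a minimal sketch of what that anchor procedure could look like. The procedure name, the tracking columns (UpdateDateTime/InsertDateTime on the same Sales tables as the MSDN sample), and the interpretation of @sync_batch_size as "minutes per batch" are all my own assumptions, not anything Sync Framework prescribes:

    ```sql
    CREATE PROCEDURE usp_GetNewBatchAnchor_DateTime (
        @sync_last_received_anchor datetime,
        @sync_batch_size bigint,                 -- ASSUMPTION: treated as minutes per batch
        @sync_max_received_anchor datetime out,
        @sync_new_received_anchor datetime out,
        @sync_batch_count int output)
    AS
        -- Default to 60-minute batches if a valid size is not passed in.
        IF @sync_batch_size IS NULL OR @sync_batch_size <= 0
            SET @sync_batch_size = 60

        -- The upper bound for this session is simply "now".
        IF @sync_max_received_anchor IS NULL
            SET @sync_max_received_anchor = GETUTCDATE()

        -- On the first sync, start from the oldest tracking value in the tables
        -- (hypothetical column names; substitute your own tracking columns).
        IF @sync_last_received_anchor IS NULL
            SELECT @sync_last_received_anchor = MIN(DateCol) FROM (
                SELECT MIN(UpdateDateTime) AS DateCol FROM Sales.Customer
                UNION ALL
                SELECT MIN(InsertDateTime) AS DateCol FROM Sales.Customer
            ) MinDates

        -- Advance the anchor by one time window instead of adding an integer.
        SET @sync_new_received_anchor = DATEADD(minute, @sync_batch_size, @sync_last_received_anchor)

        -- Number of remaining windows between the last anchor and the maximum.
        IF @sync_batch_count <= 0
            SET @sync_batch_count = (DATEDIFF(minute, @sync_last_received_anchor, @sync_max_received_anchor) / @sync_batch_size) + 1

        -- Clamp the final batch to the maximum anchor.
        IF @sync_new_received_anchor >= @sync_max_received_anchor
        BEGIN
            SET @sync_new_received_anchor = @sync_max_received_anchor
            IF @sync_batch_count <= 0
                SET @sync_batch_count = 1
        END
    GO
    ```

    One caveat with this approach: a time window bounds the *period* of a batch, not its row count, so batches will be uneven and may even be empty when changes are clustered in time. That is the trade-off you accept by not using a monotonically increasing timestamp.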
    Wednesday, July 8, 2009 2:08 PM
  • Hi,

    I tried it several times, like many others here on the board,
    and I guess that there isn't any solution for it.

    If you find one, please let me know.


    IMHO it's very unhelpful that beginner tutorials use DateTime for no good reason,
    because timestamp seems to be much more useful for the Sync Framework.



    Regards,
    Martin
    Thursday, July 9, 2009 8:23 AM
  • I agree - I think it would help if the VS designer didn't allow you to use datetime as tracking columns; at least then you'd be forced from the outset down a route that would actually work with batching (which I just assumed would work with any data type; batching is only necessary because sync services for devices is incredibly slow). Now I've got to undo weeks of work, changing data types in all my tables and rewriting the front-end and back-end of two different large applications just to get back to the point I am at now. Very frustrating, and it will cost me a lot of money in unpaid work, as I can't invoice my client until the app is finished :(
    • Edited by SunHunter Thursday, July 9, 2009 7:51 PM
    Thursday, July 9, 2009 7:49 PM
  • FullACK!

    100% the same here!
    Friday, July 10, 2009 7:14 AM
  • Hi macap, did you manage to get batching working in the end? I changed all my tables to use timestamps but ran into problems with the batch anchor example on MSDN, which may have a couple of issues. I posted on a blog that I found, as somebody else seems to have solved the same problem. I think my issue is that I am using multiple sync groups (as well as filtering), so I am not sure if that is affecting the batching logic in the stored procedure (I had assumed it wouldn't matter whether filtering or multiple sync groups were being used).

    I also noticed a post of yours trying to use SqlCeResultSet inserts within the ApplyingChanges event handler.

    Implementing this tweak promises something like a 10x perf benefit, so I am keen to get it working; but I am running into the same issue as you when trying to do identity inserts, which causes the same error that you experienced. So I can't get this working either.

    Anyway, I'm just about ready to give up with sync services, it's been a nightmare from start to finish to get anything working.



    Monday, July 13, 2009 12:30 PM
  • Hi SunHunter,

    I'm sorry, but I cannot help you.
    I have exactly the same two problems you mentioned, but at the moment I'm not trying to solve them,
    because I don't have the time to experiment.


    My sync solution is working without batching. So it's slow, but it's working.
    "Never change a running system" :-)
    But it is also not very big: I use only 5 tables, each with about 3,000 rows, and one with about 15,000.

    Maybe you have more problems because you have bigger tables?


    I will try it again when I have a bit more time.
    In the meantime, I hope my problems will solve themselves once Microsoft releases a properly working Sync Framework.



    Regards,

    Martin


    PS:
    You should not use the code generated by Visual Studio as-is, because it always syncs the whole table with all columns,
    but most of the time you only need a small subset of the columns on your mobile device.
    So you can use the generated code, but you should remove all unnecessary columns. This saves a lot of time, memory, and storage space.
    Monday, July 13, 2009 12:48 PM
  • Hi, my tables are about the same size as yours (probably smaller, but more of them) and I'm only downloading a subset of columns, but I'm still not happy with the performance. It takes forever to insert the rows on the client - a major perf bottleneck - which is why I wanted to get that SqlCeResultSet inserts tweak working as that could help to boost performance considerably.

    Batching would also help to reduce memory usage on the device if I could get that working as well.

    Good luck with your project, hope it goes better than mine...
    • Edited by SunHunter Monday, July 13, 2009 1:23 PM
    Monday, July 13, 2009 1:04 PM
  • How many SyncGroups do you have?

    Maybe it helps if you divide the tables into smaller SyncGroups.

    Good luck for your project too!



    Regards,

    Martin

    Monday, July 13, 2009 3:51 PM
  • I've got 21 tables split into 7 sync groups and pretty evenly sized. There is one table with about 1000 rows (which I'd like to batch to see if it improves speed at all and also use that 'insert' tweak in the other thread), but most of the other tables have 1 to 100 rows. Maybe the overhead is in having quite a few tables even though they are very small.
    Monday, July 13, 2009 5:34 PM
  • Your 1000-row table:

    Is it a download from the server to the client/mobile device, or an insert from the device to the server DB?

    What counts as "slow" for you?
    What kind of times are we talking about?

    3 tables per sync group would be OK.
    But try putting the 1000-row table into its own sync group, just to see what happens ;-)

    Do you also sync only the rows you actually need (as I mentioned before)?
    And are you sure that you have to sync all the tables?
    In my scenario nearly 70% of all tables are download-only/snapshot.


    Regards,
    Martin
    Tuesday, July 14, 2009 7:40 AM