OutOfMemory exception while saving GBAppraiseDemoDataSet

  • Question

  • Frequently I must add a new database table to the data set.  When I hit save, I watch painfully (usually for ten minutes or more) as all the memory on my workstation gets absorbed by devenv.exe (last count was 1,118MB!).

    It just closes the data set with an error in the 'Error List' saying an OutOfMemory exception occurred in the custom tool.

    I still have some memory available (only 100MB or so of physical memory out of 2GB).  What can I do?
    • Moved by Hengzhe Li Friday, April 22, 2011 7:45 AM (From:SyncFx - Microsoft Sync Framework Database Providers [ReadOnly])
    Wednesday, November 26, 2008 6:51 AM

All replies

  • Realising I posted no information about my platform or steps to recreate the problem, here it is:
    VS2008 SP1
    .NET 3.5 SP1
    Code in C#

    My application is a modified version of the Microsoft Sync Tutorial (the dataset name may be familiar to some).

    1. Run the sync tool against the server database to select the new database table; this step also creates a new SQLCE database for me.

    2. Make modifications to the SQL scripts (only basic mods to cater for identity columns).

    3. Load the GBAppraiseDemoDataSet (in the same project the SQLCE database is located).

    4. Delete all the tables in the data set (saving+closing here doesn't affect the result).

    5. Open the SQLCE database in the database viewer tab.

    6. Drag all tables over to the data set screen (about 43 tables now, most < 10 columns, one over 200).

    7. Close the data set, selecting to save changes.

    At this point Visual Studio freezes.  In the Task Manager I can see the memory usage slowly increasing, with devenv.exe consuming memory at about 100MB a minute.  Ten minutes later it is not so much fun to watch, as it closes the data set with the error "OutOfMemory exception in custom tool".  Previously this long process would just close the data set without an error; now I can't compile my code because the error never clears from the Error List.
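
    For reference, a rough sketch of what step 6 amounts to under the hood: the designer reads the schema of each dragged table out of the SQLCE database and writes it into the .xsd (it also generates TableAdapter code, which is ignored here).  The connection string, table names and output path below are placeholders, not taken from the real project.

    // Approximation only: fill a DataSet with the schema (no rows) of each table
    // in the SQLCE database and write it out as an .xsd, roughly what the
    // designer does when tables are dragged from Server Explorer.
    using System.Data;
    using System.Data.SqlServerCe;

    class SchemaDump
    {
        static void Main()
        {
            var ds = new DataSet("GBAppraiseDemoDataSet");
            using (var conn = new SqlCeConnection(@"Data Source=GBAppraiseDemo.sdf"))   // placeholder .sdf
            {
                conn.Open();
                foreach (string table in new[] { "Table1", "Table2" })   // ~43 tables in the real data set
                {
                    using (var da = new SqlCeDataAdapter("SELECT * FROM [" + table + "]", conn))
                    {
                        da.FillSchema(ds, SchemaType.Source, table);   // schema only, no data
                    }
                }
            }
            ds.WriteXmlSchema("GBAppraiseDemoDataSet.xsd");   // placeholder output path
        }
    }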

    Wednesday, November 26, 2008 10:27 PM
  • Just bumping this post.  Here is the error message:

    ----------------------------------------------------------------------------

    Error

    Description: Custom tool error: Unable to convert input xml file content to a DataSet. Exception of type 'System.OutOfMemoryException' was thrown.

    File: C:\AdaptSource\Synchronisation\Adapt.Synchronisation.PocketPC\GBAppraiseDemoDataSet.xsd

    Line: 1

    Column: 1

    Project: Adapt.Synchronisation.PocketPC
    ----------------------------------------------------------------------------

    On a good day when I reboot my PC and try again, this tool does not crash out with this error message.  It appears dependent on physical memory, which is making it harder to work around, as I am running out of programs/services to unload.
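
    The message suggests the custom tool is essentially loading the .xsd into a DataSet ("convert input xml file content to a DataSet").  A minimal sketch of that same step, which could be run in a standalone process to see whether the schema itself loads outside devenv.exe, is below; the file path is the one from the error above.

    // Minimal sketch: load the .xsd the way the error message describes,
    // outside of Visual Studio, and report how many tables came back.
    using System;
    using System.Data;

    class XsdLoadCheck
    {
        static void Main()
        {
            var ds = new DataSet();
            ds.ReadXmlSchema(@"C:\AdaptSource\Synchronisation\Adapt.Synchronisation.PocketPC\GBAppraiseDemoDataSet.xsd");
            Console.WriteLine("Loaded {0} tables", ds.Tables.Count);
        }
    }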
    Tuesday, December 2, 2008 12:41 AM
  • This is because you have more data than can fit in memory. Consider enabling batched download of data from the server. When batching is disabled (the default behavior), sync will load all server changes into memory. This will consume all available memory, and after a certain point .NET will throw OOM exceptions.

    Refer to the post http://www.syncguru.com/projects/SyncServicesDemoBatching.aspx for more details on how to enable batching on the server side.
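
    Roughly, the server-side pattern looks like the sketch below, assuming a provider derived from DbServerSyncProvider; the stored procedure name, connection string and batch size are placeholders, so see the linked post for the full walkthrough, including the anchor stored procedure itself.

    // Hedged sketch: enable batched change enumeration on the server provider.
    // The placeholder procedure "usp_GetNewBatchAnchor" must hand out anchors
    // one batch at a time using the parameters declared below.
    using System.Data;
    using System.Data.SqlClient;
    using Microsoft.Synchronization.Data;
    using Microsoft.Synchronization.Data.Server;

    class BatchedServerSyncProvider : DbServerSyncProvider
    {
        public BatchedServerSyncProvider()
        {
            var serverConn = new SqlConnection("Data Source=.;Initial Catalog=GBAppraiseDemo;Integrated Security=True");   // placeholder
            Connection = serverConn;

            var anchorCmd = new SqlCommand("usp_GetNewBatchAnchor", serverConn) { CommandType = CommandType.StoredProcedure };
            anchorCmd.Parameters.Add("@" + SyncSession.SyncLastReceivedAnchor, SqlDbType.Timestamp);
            anchorCmd.Parameters.Add("@" + SyncSession.SyncMaxReceivedAnchor, SqlDbType.Timestamp).Direction = ParameterDirection.InputOutput;
            anchorCmd.Parameters.Add("@" + SyncSession.SyncNewReceivedAnchor, SqlDbType.Timestamp).Direction = ParameterDirection.Output;
            anchorCmd.Parameters.Add("@" + SyncSession.SyncBatchSize, SqlDbType.Int);
            anchorCmd.Parameters.Add("@" + SyncSession.SyncBatchCount, SqlDbType.Int).Direction = ParameterDirection.InputOutput;
            SelectNewAnchorCommand = anchorCmd;

            BatchSize = 50;   // number of changes per batch, instead of loading everything at once
        }
    }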

    Tuesday, December 2, 2008 9:03 PM
    Moderator
  • I would absolutely love to use batching; however, I have given up and am now waiting for the next release of Sync.  My post describing my failure in this area:

    http://forums.microsoft.com/Forums/ShowPost.aspx?PostID=3856715&SiteID=1

    I am a little confused, so you will have to bear with me.

    I do believe implementing batching is unrelated to my situation, which is where I am confused.  The error is not in my application; it happens at design time in Visual Studio.  These are my steps:

     

    1. In Visual Studio I open GBAppraiseDemoDataSet.xsd, which displays a graphical list of tables.

    2. I delete all the tables, as they contain an old list of tables and columns.

    3. In Visual Studio I load the SQLCE database in Server Explorer.

    4. I highlight all the tables in Server Explorer, then drag them onto the GBAppraiseDemoDataSet.xsd screen, which now shows the correct list of tables and columns.

    5. I click the 'X' button on the screen, selecting to save all changes.

    *** This is where my PC locks up for ten minutes and the memory is slowly gobbled up until it is full... CRASH ***

     

    This is a severe problem for me, as I have had to tell my usually happy coding colleagues to stop making table changes, because the sync application will no longer compile.

    Wednesday, December 3, 2008 11:31 PM
  • This seems to be a non-sync issue. It looks more like the VS tooling is trying to load/save table data when you close the designer. I have no idea about this behavior, though. I would recommend posting the question to the VS forum.
    Thursday, December 4, 2008 7:45 PM
    Moderator