Dynamics CRM Custom WebService error on concurrent responses

  • Question

  • I developed a custom CRM web service for a client. The web service uses the CRM SDK to create and update contacts. It works just fine, but it is not fast enough for the project requirements, so it is executed concurrently four times. Everything works and every record is updated just as it should be, until at some point the following error is thrown while the web service is executing:

    Error: COM error object information is available. Source: "ADODB.Recordset" error code: 0x800A0BCD Description: "Either BOF or EOF is True, or the current record has been deleted. Requested operation requires a current record."

    Apparently this is related to CRM's connection to SQL Server; I haven't found a solution anywhere.

    The web service has only one method, which takes a custom XML document containing the attributes to update.

    Tuesday, October 7, 2014 10:54 PM

All replies

  • Hi,

     Have you tried using ExecuteMultiple to update the records in batches?
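    For reference, a minimal ExecuteMultiple sketch looks roughly like this. It assumes you already have a connected IOrganizationService named service, and contactsToUpdate stands in for whatever list of Entity objects your XML produces:

```csharp
using System;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Messages;

// Batch all updates into a single round trip instead of one Update call per record.
var batch = new ExecuteMultipleRequest
{
    Settings = new ExecuteMultipleSettings
    {
        ContinueOnError = true,   // don't abort the whole batch on a single failure
        ReturnResponses = false   // skip per-request responses to reduce payload size
    },
    Requests = new OrganizationRequestCollection()
};

foreach (Entity contact in contactsToUpdate)          // contactsToUpdate is assumed
    batch.Requests.Add(new UpdateRequest { Target = contact });

var result = (ExecuteMultipleResponse)service.Execute(batch);
if (result.IsFaulted)
{
    // With ReturnResponses = false, only faulted items appear here.
    foreach (var item in result.Responses)
        if (item.Fault != null)
            Console.WriteLine("Request {0} failed: {1}", item.RequestIndex, item.Fault.Message);
}
```

    Note that ExecuteMultiple has its own per-batch request limit, so very large imports still need to be chunked into multiple batches.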


    There was another thread where someone reported this issue once a limit was reached (10,000 records in his case), but I am not able to find that thread.

    If plugins are registered on the contact entity, workflow expansion tasks may be affecting the updates. Please check whether that is a factor.


    By comparison, Salesforce has a batch limit of 200; see whether you can also upload in smaller batches without hitting the connection limit.

    Last, please see if a commercial bulk-upload tool works (like the SSIS Integration Toolkit: http://www.kingswaysoft.com/products/ssis-integration-toolkit-for-microsoft-dynamics-crm).

    If that also can't handle the data quickly enough, you will have to upgrade your hardware: split the web server and the background processing server, convert from a virtual machine to a dedicated physical server, etc.



    Wednesday, October 8, 2014 7:15 AM
  • I think the question I posted a few days ago might be what Jithesh is referring to. You can review his comments on that posting here.


    Regarding that posting, I can tell you what I ran into. I have a web service that syncs data between CRM and a legacy application. The service keeps hundreds of thousands of records in sync, processes as many as 10,000 records per hour, and can have a dozen threads running at any given time. A few weeks ago I ran into something similar to what you might be running into: some of the threads seemed to lose their connection to CRM and crash, while other threads ran fine and never had any issues.

    I was able to clean up a few things and it runs much better now. I still get similar errors here and there, but my cleanup certainly helped; I suspect I still need to clean up more code to completely resolve the issue. It might be worth scanning the following article to see whether it is related to CRM terminating its connection for long-running processes. I applied the suggestions there to my code and it helped.
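    To illustrate the long-running-process angle: OrganizationServiceProxy exposes a Timeout property that can be raised for long batches. A one-line sketch, assuming proxy is an existing OrganizationServiceProxy instance:

```csharp
// Sketch: lengthen the service-channel timeout so long-running batches
// aren't cut off by the default limit mid-operation.
proxy.Timeout = TimeSpan.FromMinutes(10);
```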


    Jon Gregory Rothlander

    Wednesday, October 8, 2014 9:09 PM
  • Hello! Thanks for the reply. I'm going with the multiple-request approach.

    I have an important question: what do you have to change in a web service to make it available for concurrent connections, or thread safe (if that's even the same thing)? That would really help me on future projects.

    On the other hand, I will check the aborted-connection issue; it may be related, as you say.

    I will test that and see how it goes, and I'll let you know once I've proved it out.

    Thanks again

    Wednesday, October 8, 2014 10:55 PM
  • Your question about making the service thread safe is an excellent one, and something I am working through as well. I can explain my approach and how I am attempting this. I'm not 100% sure my approach is correct, but I am working through it. It would be wonderful to find others doing similar things and share what we have found works well and what is problematic.

    For me, the web service just talks to a service class, and 99% of my logic is in that service class, which is just a C# class library. That helps with my unit testing, and I think the design is much cleaner. In that service class library I have classes containing what I am calling my threaded classes. So neither my web service nor my service class library is called multiple times, but only once (well, actually eight times per hour, once per primary entity that I am syncing). But the calling of the web service is not where the threading comes into play.

    The web service calls the service class library to execute a process that pulls a list of keys for entities that have been recently updated in the legacy system (updated since the last sync cycle), which is typically a few thousand records and sometimes as many as ten thousand. I then divide those records into logical sets based on the count and the branch (we have about 50 branches across the US), and I have a CRM configuration entity where I set the number of threads per branch and other things, so I can adjust and throttle how this works. Once the number of records and the number of threads are determined, the logic creates a thread, passes each set of keys to it, and executes it. I'm still playing with how to throttle the number of records per thread, thread pooling, using tasks instead of threads, the number of threads, etc., as I have a LOT of records to sync and I suspect I could easily overload CRM and our servers. I have overloaded the servers in one of our DEV environments, as it is a much smaller server, but so far everything is running well in production.
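    As a rough illustration of the partitioning step (the names and the simple round-robin split are my own; the real logic would also weigh the branch and the per-branch thread settings):

```csharp
using System.Collections.Generic;

// Hypothetical sketch: split recently updated legacy keys into one set per
// worker thread, round-robin, so each thread gets a roughly equal share.
static List<List<string>> PartitionKeys(IReadOnlyList<string> keys, int threadCount)
{
    var sets = new List<List<string>>(threadCount);
    for (int i = 0; i < threadCount; i++)
        sets.Add(new List<string>());
    for (int i = 0; i < keys.Count; i++)
        sets[i % threadCount].Add(keys[i]);   // round-robin assignment
    return sets;
}
```

    Each resulting set would then be handed to one thread (or task) for processing.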

    It is the service class library that spawns threads of the threaded classes. The threaded classes do the low-level sync between the legacy application and CRM, so by the time I spawn a thread, it is only reading from the legacy database and making calls out to CRM through the SDK. What I have found is that the CRM SDK seems to handle many threads all creating/updating records in CRM. Although each thread seems to run pretty slowly, I've been able to spawn as many as 100 threads at a time without any issues. However, I've pulled that back a great deal, as I don't want to push it to its limits.
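    One detail worth calling out for the thread-safety question above: a single OrganizationServiceProxy instance is not thread safe, so in a design like this each worker thread typically builds and owns its own proxy. A sketch, assuming an on-premises deployment with default Windows credentials (the org URL is illustrative):

```csharp
using System;
using System.Net;
using System.ServiceModel.Description;
using Microsoft.Xrm.Sdk.Client;

// Each worker thread constructs its own proxy; proxies are never shared across threads.
var credentials = new ClientCredentials();
credentials.Windows.ClientCredential = CredentialCache.DefaultNetworkCredentials;

var orgServiceUri = new Uri(
    "https://crm.example.com/MyOrg/XRMServices/2011/Organization.svc");  // illustrative

using (var proxy = new OrganizationServiceProxy(orgServiceUri, null, credentials, null))
{
    proxy.EnableProxyTypes();   // enable early-bound entity types, if you use them
    // ... this thread's creates/updates go through its own proxy ...
}
```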

    In regards to thread safety, I have been working directly with non-pool threads in my sync classes, but I also have other smaller classes that I am spawning as tasks. I suspect that my non-pooled design has issues in this regard. To work around that, I have spent a lot of time implementing IDisposable in all of my classes, and I perform a .Dispose cascade through all of them to handle my own cleanup... and I suppress the garbage collector's finalization. My goal here is to free up memory resources as fast as possible and not depend on the .NET GC, but I am still working through the design. Currently, when I monitor the memory profile, it seems to run well and memory seems well managed. But I am not 100% sure about the design yet; I suspect I have issues that I have yet to resolve.
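    To make the dispose-cascade idea concrete, here is a minimal sketch of the pattern I mean (the class and member names are made up):

```csharp
using System;

// Hypothetical worker that owns another disposable and cascades Dispose to it.
public sealed class SyncWorker : IDisposable
{
    private readonly IDisposable _legacyReader;   // stand-in for an owned resource
    private bool _disposed;

    public SyncWorker(IDisposable legacyReader)
    {
        _legacyReader = legacyReader;
    }

    public void Dispose()
    {
        if (_disposed) return;                // safe to call more than once
        if (_legacyReader != null)
            _legacyReader.Dispose();          // cascade to owned members
        _disposed = true;
        GC.SuppressFinalize(this);            // the "suppress the garbage collector" step
    }
}
```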

    Jon Gregory Rothlander

    Thursday, October 9, 2014 3:04 PM