Minimizing web service calls vs minimizing workflow execution - best practices / better ways?

  • Question

  • Howdy peeps. I'm not a CRM developer, but I'm doing some architectural work around CRM, optimizing its integration with other systems.

    At my company we push some data into CRM from other systems, such as our ERP. We of course use the CRM web services to write the data into CRM.

    Usually there are several fields being integrated at once. For instance, we might want to push the values for fields "A" and "B" in the account (organisation) entity in CRM based on values in the ERP.

    Now, let's say we have workflow_A that triggers when field "A" is updated, workflow_B that triggers when field "B" is updated. My understanding here is that "updated" does not mean the value is necessarily changed in CRM, just that we push a value to the service. If the new value is the same as the old value the workflow will still trigger. If my understanding here is not correct then please let me know.

    So, there are two ways we could do this:

    - Call the web service at most once, pushing the values for both "A" and "B" if *either* of them need to be updated, or
    - Call the web service at most twice, push "A" if it needs to be updated and, independently, push "B" if it needs to be updated.

    In the former case we make fewer web service calls but may unnecessarily cause workflows to execute. For example, if the value for A has changed but B has not, workflow_B will still run. In the latter case we make more web service calls, but workflows will only kick off if they really need to.

    For two fields this may not seem like a very important decision... but as it happens we have about 10 extension attributes with associated update workflows that are being pushed from various other systems.

    Would anyone like to provide their opinion on which option we should take? Are there any Microsoft recommendations on this specific issue? I wasn't able to find anything but it's rather difficult to google such a specific question without getting thousands of hits for general workflow development.

    • Edited by allmhuran Monday, June 15, 2015 5:14 AM
    Monday, June 15, 2015 5:13 AM


All replies

  • In your integration code, if you know which fields have changed, simply add just those fields to the update object; that way only the minimum number of workflows will be triggered for your update scenario...
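    For anyone building the update object against the SDK directly, here is a minimal sketch of that approach; the IOrganizationService named service, the field names (new_fielda, new_fieldb), and the ERP/CRM values are illustrative, not from the thread:

```csharp
using Microsoft.Xrm.Sdk;   // CRM 2011 SDK

// Build an update containing only the attributes that actually changed,
// so only the workflows registered on those fields will fire.
Entity update = new Entity("account") { Id = accountId };

if (!object.Equals(erpValueA, crmValueA))
    update["new_fielda"] = erpValueA;

if (!object.Equals(erpValueB, crmValueB))
    update["new_fieldb"] = erpValueB;

// Skip the web service call entirely when nothing changed.
if (update.Attributes.Count > 0)
    service.Update(update);
```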
    Monday, June 15, 2015 7:52 AM
  • That would be ideal; unfortunately, the interface doesn't provide this capability - the interface in this case being an SSIS plugin that provides CRM service functionality (Task Factory). The fields to be populated must be set at design time, not at run time. This is true of SSIS more generally... columns in a data flow generally cannot be changed at run time.
    • Edited by allmhuran Tuesday, June 16, 2015 12:23 AM
    Tuesday, June 16, 2015 12:22 AM
  • I'd say the best practice would be to only send a field value when the new value is different from the old value, so the second case would be better practice. The main reason for suggesting this is to avoid unexpected consequences if you have workflows or plugins that expect the new value to differ from the old one.

    There is another option. Even if you are using tools that generate the CRM web service calls for you (which prevent you from excluding attributes that haven't changed), it is possible to develop a plugin that checks old and new values. The plugin can remove attributes from an update if they haven't changed, and thus prevent unnecessary workflow execution.
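    A minimal sketch of such a plugin, assuming it is registered pre-operation on the Update message with a pre-image (named "PreImage" here) that includes the attributes being filtered; the class name is illustrative:

```csharp
using System;
using System.Linq;
using Microsoft.Xrm.Sdk;

// Pre-operation Update plugin: remove attributes whose incoming value
// equals the current value, so unchanged fields never reach the platform
// and the workflows registered on them do not fire.
public class RemoveUnchangedAttributes : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)
            serviceProvider.GetService(typeof(IPluginExecutionContext));

        if (!context.InputParameters.Contains("Target") ||
            !context.PreEntityImages.Contains("PreImage"))
            return;

        var target = (Entity)context.InputParameters["Target"];
        Entity preImage = context.PreEntityImages["PreImage"];

        // object.Equals is a simplification; SDK types such as Money,
        // OptionSetValue or EntityReference may need value-based comparison
        // in a production version.
        var unchanged = target.Attributes
            .Where(a => preImage.Contains(a.Key) &&
                        object.Equals(preImage[a.Key], a.Value))
            .Select(a => a.Key)
            .ToList();

        foreach (string key in unchanged)
            target.Attributes.Remove(key);
    }
}
```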


    Microsoft CRM MVP - http://mscrmuk.blogspot.com/ http://www.excitation.co.uk

    • Proposed as answer by Chris_Harrington Tuesday, June 16, 2015 12:09 PM
    • Marked as answer by allmhuran Thursday, June 18, 2015 1:29 AM
    Tuesday, June 16, 2015 9:10 AM
    Moderator
  • Thanks for the reply.

    Ultimately, we have quite a few issues with integrating data into CRM via the web services. WCF crashes out intermittently for reasons unknown ("An unsecured or incorrectly secured fault was received from the other party. See the inner FaultException for the fault code and detail."), and each failure then causes all subsequent rows to fail because the service channel is left in a faulted state. My hunch is that it's load related.
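    For what it's worth, a faulted WCF channel can never be reused, so one common mitigation (not something discussed in the thread) is to dispose and recreate the OrganizationServiceProxy and retry the failed row; createProxy and maxAttempts below are assumed helpers, not part of the SDK:

```csharp
using System;
using System.ServiceModel.Security;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Client;

// Retry an update across proxy recreations. Once the channel has faulted
// (e.g. the "unsecured or incorrectly secured fault" error) it cannot be
// reused, so each retry builds a brand new proxy rather than reusing the
// poisoned one. createProxy is an assumed factory returning a fresh,
// authenticated OrganizationServiceProxy.
static void UpdateWithRetry(Func<OrganizationServiceProxy> createProxy,
                            Entity update, int maxAttempts)
{
    for (int attempt = 1; ; attempt++)
    {
        OrganizationServiceProxy proxy = createProxy();
        try
        {
            proxy.Update(update);
            return;
        }
        catch (MessageSecurityException)
        {
            // Give up after maxAttempts; otherwise loop with a fresh proxy.
            if (attempt >= maxAttempts)
                throw;
        }
        finally
        {
            proxy.Dispose();
        }
    }
}
```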

    The next rollup due to be deployed will provide the bulk loading service for CRM 2011, which might help a bit, although using it would preclude the SSIS plugin component. So I've decided the best solution here is to wait for the rollup to be deployed, get rid of the SSIS plugin, batch up rows that share the same set of changed columns, and call the new batch service from hand-crafted script in SSIS to give better control over the whole process.
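    If the bulk loading service in question is ExecuteMultipleRequest (the batching message added to CRM 2011 in Update Rollup 12), the hand-crafted SSIS script might batch rows roughly like this; changedRows, service and LogFailure are illustrative:

```csharp
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Messages;   // ExecuteMultipleRequest, UpdateRequest

// Send a batch of updates (up to 1000 per request) in a single round trip.
var batch = new ExecuteMultipleRequest
{
    Requests = new OrganizationRequestCollection(),
    Settings = new ExecuteMultipleSettings
    {
        ContinueOnError = true,   // one bad row no longer faults the rest
        ReturnResponses = true    // keep per-row results for error handling
    }
};

// changedRows: entities built to contain only the columns that changed.
foreach (Entity update in changedRows)
    batch.Requests.Add(new UpdateRequest { Target = update });

var response = (ExecuteMultipleResponse)service.Execute(batch);

// Log individual faults instead of losing the whole batch.
foreach (var item in response.Responses)
    if (item.Fault != null)
        LogFailure(batch.Requests[item.RequestIndex], item.Fault);
```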

    As a 15-year SQL guy who is used to an environment where I can modify tens of thousands of rows of data in mere milliseconds, I must say that I really hate service-oriented architectures :)
    • Marked as answer by allmhuran Thursday, June 18, 2015 1:29 AM
    Thursday, June 18, 2015 1:17 AM
  • Check into the KingswaySoft plugin for SSIS; it may be suitable for you.
    Thursday, June 18, 2015 6:23 AM