Considerations while Importing Data in Dynamics CRM

  • Question

  • Hi,


    I have a scenario where there are already 70 Million+ records in an Entity.

    The client is expecting us to create around 1 million records on the 1st of each month, and then update those records as updates come in through a batch import.


    The records created will be child records (1:N) of a parent record, and there can be a scenario where a child record already exists for the given month.


    The flow will look something like below:


    1. Search whether there is any child record for the given month. If yes, ignore it; otherwise create one. Total active parent records: 1 million. Total records in the parent entity: 5 million.
    2. Every day, a batch job: search for the child record created in step 1 and update it. Daily inflow is around 10,000 to 50,000 records.
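The two steps above can be sketched as follows. This is only an illustration of the logic, using an in-memory dict as a stand-in for the CRM entity; the function and field names (`parent_ids`, `month_key`, and so on) are hypothetical, not part of the CRM SDK.

```python
# Sketch of the monthly create / daily update flow. The dict `store`
# stands in for the child entity, keyed by (parent, month).

def monthly_create(store, parent_ids, month_key):
    """Step 1: create one child record per parent unless one already exists."""
    created = 0
    for pid in parent_ids:
        key = (pid, month_key)          # one child per parent per month
        if key not in store:            # pre-existing record -> ignore
            store[key] = {"status": "new"}
            created += 1
    return created

def daily_update(store, updates, month_key):
    """Step 2: apply the daily batch of updates to this month's children."""
    applied = 0
    for pid, payload in updates:
        key = (pid, month_key)
        if key in store:                # only touch records created in step 1
            store[key].update(payload)
            applied += 1
    return applied
```

Run twice for the same month, the create step is a no-op the second time, which matches the "if yes, ignore" rule.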


    I am very concerned about implementing it this way and have tried to suggest workarounds, but the client is set on this approach because their previous system had something similar.


    Can you suggest what optimizations I should consider if I have to implement it this way, and what the best approach would be?



    Abhinav Ranjan

    Wednesday, June 10, 2015 2:03 AM

All replies

  • If you are going to write custom code to do this, I would first filter out the records you are not going to process, e.g. with a SQL query (it is faster than calling the service to check each record), and then import them, say 10 or 100 at a time, in a single call (if I remember correctly there was a bulk import).

    1. Filter out all records you are not going to import.
    2. Insert the records you intend to import in bulk.

    Halldór Jóhannsson

    Wednesday, June 10, 2015 9:54 AM
  • Halldór has a good plan here.  I would do all the heavy lifting to determine which records are inserts and which are updates offline, in a SQL Server database; it will be much faster that way.  Then, after that work is done, make your insert/update calls to the CRM web service.  Use techniques similar to an ETL load into a data warehouse, which you can research with some Bing/Google searches.
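    The offline insert/update split is essentially a set partition. In practice it would be a SQL join (or EXCEPT/INTERSECT) against a copy of the CRM keys; the sketch below just shows the same logic, with hypothetical inputs.

    ```python
    # Partition an incoming batch of keys into inserts (not yet in CRM)
    # and updates (already in CRM), before any web-service call is made.
    def segregate(incoming_keys, crm_keys):
        inserts = incoming_keys - crm_keys
        updates = incoming_keys & crm_keys
        return inserts, updates
    ```

    Only after this partition is computed does anything hit the CRM web service, which is what makes the ETL-style approach fast.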
    Wednesday, June 10, 2015 1:40 PM
  • Halldór, Chris, thanks a lot for the suggestions.

    So now the plan is to

    1. Create an SSIS job which queries the filtered views and segregates, into a staging table, the data that needs to be updated (capturing the GUID) from the data that needs to be created.

    2. Based on that table, the next part does the bulk execution (ExecuteMultipleRequest).
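    For step 2, note that an ExecuteMultipleRequest call is capped (at most 1,000 requests per call in the SDK), so the driver has to chunk the staging-table rows and decide how to handle per-row faults (the ContinueOnError setting). A rough sketch of that driver loop, where `execute_batch` is a hypothetical stand-in for the actual SDK call:

    ```python
    # Feed rows to the bulk call in capped batches, continuing past
    # per-row failures and collecting them for a retry/report pass.
    def run_bulk(rows, execute_batch, batch_size=1000):
        succeeded, faults = 0, []
        for i in range(0, len(rows), batch_size):
            batch = rows[i:i + batch_size]
            for row, error in execute_batch(batch):   # (row, None) on success
                if error is None:
                    succeeded += 1
                else:
                    faults.append((row, error))       # keep going on error
        return succeeded, faults
    ```

    Collecting the faults rather than aborting matters at this volume: with 10,000 to 50,000 rows a day, you want one failed row to cost you one row, not the whole batch.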

    Thursday, June 11, 2015 1:02 AM
  • I think you need to consider whether you need that granularity of data in CRM. Bear in mind that you will have problems rendering charts that aggregate more than 50,000 records. Often the best solution is some integration with a backend database that does periodic (daily?) aggregation/summarising of the base data. In one implementation I did, granular data was important for the most recent month, but data older than that could be summarised, so we had an overnight process to remove the old data and replace it with a summary before adding yesterday's data. Thanks, Chris
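    The overnight summarisation Chris describes could be sketched like this. Purely illustrative: the row shape (`parent`, `month`, `amount`) and the `summarise` function are assumptions, not anything from CRM.

    ```python
    from collections import defaultdict

    # Collapse granular rows older than the retention window into one
    # summary row per parent per month, keeping recent rows as-is.
    def summarise(rows, keep_month):
        keep, totals = [], defaultdict(float)
        for row in rows:
            if row["month"] >= keep_month:          # recent: keep granular
                keep.append(row)
            else:                                   # old: fold into summary
                totals[(row["parent"], row["month"])] += row["amount"]
        summaries = [{"parent": p, "month": m, "amount": a, "summary": True}
                     for (p, m), a in sorted(totals.items())]
        return summaries + keep
    ```

    Run nightly, this keeps the row count in the entity roughly flat instead of growing by a million records a month.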
    • Proposed as answer by Chris_Cohen Saturday, June 13, 2015 4:41 PM
    Saturday, June 13, 2015 4:37 PM