Friday, February 17, 2012

Improve CRM Data Load Performance by using BDD in SSIS

[UPDATE - July 7, 2017] In our most recent v9 release, we added support for multi-threaded writing in our CRM destination component, so this blog post is no longer necessary. You can simply turn on multi-threaded writing in the destination component to achieve the same result (with much greater flexibility and even better performance). Also, the numbers discussed in this blog post are badly outdated. I am keeping the post here for reference purposes only; you should not use the numbers as a benchmark baseline. [/UPDATE]

[DISCLAIMER] This blog post mainly talks about the benefits of our commercial offering, but the practice should be generally applicable to other approaches or components that you might currently be using in SSIS. It is not my intention to turn my blog into a commercial space, but I do believe this blog post would help even if you are using something different, assuming that SSIS is your data integration tool. [/DISCLAIMER]

If you have ever been engaged in any Microsoft Dynamics CRM data integration project, I am relatively sure that you have invested time and resources to tune your data integration component to its maximum possible performance so that it takes the least time to finish the data integration tasks.

This blog post shows you how to load one million records into a Microsoft Dynamics CRM 2011 on-premise installation within a two-hour window, using our product - SSIS Integration Toolkit for Microsoft Dynamics CRM - by taking advantage of the Balanced Data Distributor (BDD) component that Microsoft released to the community for SQL Server Integration Services (SSIS).

In case you are not familiar with the BDD component, here is a little background. BDD is a data flow transformation component that takes a single input and evenly distributes the incoming rows across one or more outputs using multiple threads. The purpose of the BDD component is to maximize the throughput of ETL data flow tasks. BDD is useful when a downstream pipeline component (say, the destination component) is the bottleneck of the entire data flow task.

When working with Microsoft Dynamics CRM data integration, we have a perfect reason to use BDD: writing data into CRM is slow by the nature of its web service interface. In other words, in most cases you will find that the CRM destination component, which writes data into CRM, is the bottleneck of your data flow task. Using BDD, we can take the incoming rows from the upstream pipeline and split them across multiple CRM destination components, so that they write data into CRM concurrently by taking advantage of the multi-threading capability of the SSIS engine.
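Conceptually, what BDD does is a round-robin split of the incoming rows across parallel writers. The following Python sketch is illustrative only - `write_to_crm` is a hypothetical stand-in for the slow CRM web service call, not part of SSIS or the toolkit - but it shows why multiple concurrent writers help when each individual write is latency-bound:

```python
import queue
import threading

def write_to_crm(row):
    """Hypothetical stand-in for a slow CRM web service call."""
    pass

def worker(q):
    # Each worker drains its own queue, like one CRM destination component.
    while True:
        row = q.get()
        if row is None:      # sentinel: no more rows for this output
            break
        write_to_crm(row)

def distribute(rows, n_outputs):
    """Round-robin rows across n_outputs queues, one writer thread each."""
    queues = [queue.Queue() for _ in range(n_outputs)]
    threads = [threading.Thread(target=worker, args=(q,)) for q in queues]
    for t in threads:
        t.start()
    counts = [0] * n_outputs
    for i, row in enumerate(rows):
        idx = i % n_outputs  # even, round-robin distribution
        queues[idx].put(row)
        counts[idx] += 1
    for q in queues:
        q.put(None)          # tell each worker to stop
    for t in threads:
        t.join()
    return counts

counts = distribute(range(1000), 10)
# 1,000 rows split across 10 outputs -> 100 rows each
```

The even split matters: if one output received a disproportionate share of the rows, that one slow writer would again become the bottleneck.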

To demonstrate the benefits of the BDD component, I first tried a single CRM destination component in my data flow task without BDD, so the data flow writes data into CRM using a single thread. It took 5 hours, 48 minutes to load 1,000,000 records into the CRM contact entity. Here is what the data flow task looks like.
The following screenshot shows how the data flow runs using the dtexec command line.

Next, I used BDD to split the input into 10 outputs so that we write to the CRM contact entity using 10 concurrent threads. The data flow finished loading 1,000,000 records in 2 hours, 3 minutes. Here is what the data flow task looks like.

The following screenshot shows how the data flow runs using the dtexec command line.
The improvement is roughly 2.83 times. It is not surprising that the speedup falls well short of 10x: the CRM server's processing capacity and the network latency limit how far additional client threads can scale.
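Quick arithmetic on the two measured run times confirms the speedup and the effective throughput:

```python
# Compare the two measured runs:
# 5 h 48 min single-threaded vs 2 h 3 min with 10 BDD outputs.
single_threaded = 5 * 60 + 48   # 348 minutes
with_bdd        = 2 * 60 + 3    # 123 minutes

speedup = single_threaded / with_bdd
rows_per_min = 1_000_000 / with_bdd

print(round(speedup, 2))    # -> 2.83
print(round(rows_per_min))  # -> 8130 rows per minute
```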

[UPDATE - Apr 24, 2012] To make full use of the BDD component, you need to raise the connection limit imposed by the Microsoft .NET Framework, which by default allows a maximum of 2 concurrent connections per host (e.g. server) for service calls, as far as the CRM platform is concerned. To override this limit, modify the DTExec.exe.config and DtsDebugHost.exe.config files under the DTS\Binn folder by adding the following connectionManagement entry.
     <add address="*" maxconnection="100"/>
The above configuration allows up to 100 connections per host at the same time. You may change the number based on your needs.
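For clarity, the add element above belongs inside a connectionManagement section under system.net. A minimal config file would be shaped like the following; in practice, merge the system.net section into the existing contents of DTExec.exe.config rather than replacing the file:

```xml
<!-- DTExec.exe.config (apply the same change to DtsDebugHost.exe.config) -->
<configuration>
  <system.net>
    <connectionManagement>
      <!-- Allow up to 100 concurrent connections to any host -->
      <add address="*" maxconnection="100"/>
    </connectionManagement>
  </system.net>
</configuration>
```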

After making the above changes, I observed a further performance improvement: I was able to load 0.9 million records within one hour (5 outputs and 10 outputs delivered almost identical performance). Note that this was done on a desktop computer; if you have a better server with performant IO and more compute power, I am relatively sure that you can load one million records within one hour.

A few facts

  1. This is not a scientific benchmark. 
  2. My testing was conducted on a four-year-old desktop computer with everything installed on a single box. The following is the spec of the computer. 
    • Processor: Intel Core 2 Quad Q9550 @2.83GHz
    • Memory: 8GB PC2-6400 DDR2-SDRAM
    • Hard Disk: Seagate 7200RPM SATA 1.5Gb/s
    • Operating System: Windows Server 2008 R2
    • Database Server: SQL Server 2008 R2
    • Microsoft Dynamics CRM Server 2011 with Rollup 6
    • SSIS Adapter: KingswaySoft SSIS Integration Toolkit for Microsoft Dynamics CRM
  3. The testing was done in an on-premise environment; your data load performance will differ if you are using CRM Online or a partner-hosted environment. 
  4. I intentionally used the 64-bit dtexec.exe in the hope of taking advantage of the SSIS 64-bit runtime. Contrary to what I expected, running the package with the 32-bit dtexec.exe was actually not slower but about 10% faster than the 64-bit runtime. The reason is probably the extra cost of memory addressing in the 64-bit runtime. 
  5. My input data is very simple: it has only two fields, firstname and lastname. With more fields, you should expect the data load performance to degrade accordingly. 
  6. The numbers I got were measured on a desktop computer with everything installed on the same box, so they should NOT be considered a performance benchmark. Also, the techniques discussed in this blog post are outdated by today's standards. I am keeping the post here for reference purposes only. 
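If you want to reproduce a similarly simple input data set and don't have a data generator tool handy, a few lines of Python can produce a two-column CSV matching the firstname/lastname schema used above (the name lists here are made up purely for illustration):

```python
import csv
import random

# Small illustrative name pools; any lists will do.
FIRST_NAMES = ["James", "Mary", "John", "Linda", "Robert", "Susan"]
LAST_NAMES  = ["Smith", "Brown", "Taylor", "Wilson", "Lee", "Clark"]

def generate(path, n):
    """Write n random firstname/lastname rows to a CSV file."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["firstname", "lastname"])
        for _ in range(n):
            writer.writerow([random.choice(FIRST_NAMES),
                             random.choice(LAST_NAMES)])

generate("contacts.csv", 1000)  # scale n up to 1,000,000 as needed
```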


  • BDD improves the data load performance by taking advantage of the multi-threading capability of SSIS engine. 
  • You should carefully choose the number of outputs for the BDD component; more is not always better. Depending on your servers' capacity (including processor, memory, and IO system) and the network latency between your client system and the CRM server, the optimal number could be 3, 5, 10, or something else, which you can find out by running tests at different settings. 
  • There are many ways to improve data load performance; BDD is just one of the easier ways to make the data load faster, and it is the main topic this blog post set out to cover.  
If you are interested in any of the data flow tasks or sample data, please feel free to let me know, so that I can send you the SSIS package.

Thanks for reading.


  1. Can I do the same for Sql Server 2012?

    1. Yes, you can. There is a new version of BDD component which supports SQL Server 2012.

  2. Yes, but I have a DTS package that worked very well on SQL Server 2008 R2 (Windows Server 2008 R2), using a Conditional Split to simulate multi-threading so I could upload data to Dynamics CRM 2011 without problems. Now that I'm on Windows Server 2012 and SQL Server 2012, the multi-threading no longer works; it only uses 2 concurrent connections to the web services.

    1. If you don't mind using Conditional Split, it should still work for SQL Server 2012.

    2. I'm still trying to get multi-threading to work; I don't know why SQL Server 2012 doesn't pick up the configuration you indicated in the XML config files.

    3. You would need to make changes to a few .config files. In the case that you are running on a 64-bit system, you would need to make changes to the files under both Program Files and Program Files (x86) folders.

  3. I tried all of this using CozyRoc as my adapter of choice, and while using BDD and the setting changes recommended by you and others, it appears that with 5 or 8 destination components, the throughput to 2 network load balanced web servers was roughly the same, about 7k records per minute (give or take). I could not get it to go past that limit no matter what changes I made to buffers etc. I instead decided to leave my 8 BDD components in one task inside one Control Flow item, and then copy that control flow item 5 times. With this, and altering the source in each control flow item to break it out, I was able to increase this to a whopping 18k per minute, which equated to about 1 million contact records per 55 minutes. This nearly maxed out the SSIS server (10GB RAM, 4 processors; could probably add one more), and made the 2 load balanced servers peak at 95% usage. I believe I've reached my limit with this hardware configuration.

    1. @John, I am offended when you mention CozyRoc. Just kidding, I am not. Hopefully this post was helpful, thanks for sharing your experience.

      Kind regards,
      Daniel Cai

    2. Ah, yes, sorry about that - client's choice. We are an MS Partner, using the tools they have. P.S. In tweaking the heck out of everything, we have been able to hit 500 contacts per second, about 30k a minute, which works out to about 38 minutes per million rows, give or take. Fun times.

    3. Update: we are now using 8 SSIS servers, with the CRM front end installed on all of them, and the hosts files having SSIS point to itself and a 20-processor DB server. Each of the SSIS/app combos has 16 procs. We have seen peaks of over 180,000 per minute and are averaging about 155,000 contacts per minute (remember, this is really 1 base row, 1 extensionbase, 2 addressbase, and 2 addressextensionbase records, so 155,000 x 6, or 930,000 physical rows per minute), avg = 2,583 contacts a second.

    4. @John, that's a very impressive number. With CozyRoc, you don't really have a way to specifically point to a physical server, so you have to use the hosts file hack; with us, there is no need to do so. Just trying to point out a minor difference.

    5. @John, we handle many details that CozyRoc doesn't handle. If you have another project engagement, I will be more than happy to work with you for a case study or something similar. Please feel free to ping me at any time.

      Daniel Cai

  4. Dear Daniel,

    I hope this finds you well.

    Great work on integration and migration of data.

    Kindly requesting the data flow tasks, sample data, and SSIS package used in your awesome blog post titled "Improve CRM Data Load Performance by using BDD in SSIS".

    Your assistance will be greatly appreciated


    Richard Silla

    1. Hi Richard,

      Thanks for reaching out.

      Unfortunately, I didn't keep the data flow tasks, and it should be a straightforward process to create them. For sample data, I used Spawner Data Generator which allows you to generate any number of sample records based on the schema you define.

      Let me know if you have any additional questions.

      Kind regards,
      Daniel Cai