You are getting close to your "Go Live Date". You have gone through multiple iterations of testing, and you are now building and testing your data migration and integration processes. You have to move millions of rows quickly; in addition, you have a daily integration that needs to move a high number of rows in a very short time frame. You are able to push some data, but not fast enough.
If this sounds familiar, you may have run into this issue in the past, or you may run into it in the future. In this blog I'll cover a scenario where we need to move data using ADF (Azure Data Factory) and how to optimize the data transfer. I'll also explain the impact of different configurations such as batch size and parallelism level. I hope you find this blog helpful.
We need to move data from one instance to another; the data we are moving is a custom table. It is a very simple table: not too many fields, and no plugins are triggered. The amount of data we need to move is about 36K rows, and we need to move it as fast as we can.
If we move row by row, and each row takes 1 second to process, it may take up to 10 hours (36K seconds) to process all the rows. Luckily, Dynamics 365 supports batching and parallelism, so we can speed up the process. When we plan to utilize parallelism, we should also look at the Power Platform throttling and API call limitations:
Service Protection API Limits
Requests limits and allocations
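To get a feel for why batching and parallelism matter, here is a rough back-of-the-envelope calculation (the batch size, thread count, and timing below are illustrative assumptions, not measured results): with a batch size of 100 rows per request and 10 parallel requests, 36,000 rows require 36,000 / 100 = 360 requests, or 36 rounds of 10 concurrent requests. If each batch request takes about 2 seconds, the whole load finishes in roughly 36 × 2 = 72 seconds instead of 10 hours, provided we stay under the service protection limits linked above.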
To figure out the best configuration for moving data, I experimented with different scenarios. In the screenshots below, you can see the Azure Data Factory configuration for the Dynamics 365 connector.
In the Sink tab, you can configure the batch size and max concurrent connections:
In the Settings tab, you can configure the degree of copy parallelism:
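If you author pipelines as JSON rather than through the designer, the same settings appear in the copy activity definition. The following is a minimal sketch, assuming a Dynamics 365 sink; the source type, names, and values shown are placeholder assumptions, not recommendations:

```json
{
    "name": "CopyToDynamics365",
    "type": "Copy",
    "typeProperties": {
        "source": { "type": "AzureSqlSource" },
        "sink": {
            "type": "DynamicsSink",
            "writeBehavior": "upsert",
            "writeBatchSize": 10,
            "maxConcurrentConnections": 10
        },
        "parallelCopies": 16
    }
}
```

Here writeBatchSize maps to the batch size on the Sink tab, maxConcurrentConnections to the max concurrent connections, and parallelCopies to the degree of copy parallelism on the Settings tab.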
In case you are not familiar with Azure Data Factory, here are two useful links:
Copy data from and to Dynamics 365 (Common Data Service) or Dynamics CRM by using Azure Data Factory
Azure Data Factory documentation
I ran multiple scenarios where I moved the data; you can see the results in the following table, measured as create rows per second:
Now consider the scenario where I deal with a larger data set: can I use the same configuration, or do I need to modify it? Please see the following scenarios:
Please note that I have oversimplified the process here. Things to consider when moving data: