We have many SSIS packages that create or update records in CRM (on-premises), built with KingswaySoft components. For some entities, such as contact, account, and customer product (our custom entity that holds all account types), migration runs slowly and drives high CPU on the SQL Server machine (20 processors). For most other custom entities we see no slowness at all.

For example, with an SSIS package called CASA, a run of 10,000 records completes in under a minute. A run of 250,000 records should then finish in roughly half an hour, but it takes 1.5-2 hours. We tried the combinations below, but none of them improved CPU usage, and all of them took about the same time. We do not understand why small loads finish so quickly while larger loads struggle.

We also noticed that, for every record, a query is sent to the business process flow tables to check whether a business process instance exists for it, and that consumes a lot of CPU as well. Once the business process flow option is enabled on an entity, there is no way to turn it back off.
Thread: 64, Batch: 250, AutoAdjustBufferSize: True, DefaultBufferMaxRows: 1,000,000; CPU was around 90 percent
Thread: 32, Batch: 250, AutoAdjustBufferSize: True, DefaultBufferMaxRows: 1,000,000; CPU was around 90 percent
Thread: 20, Batch: 250, AutoAdjustBufferSize: True, DefaultBufferMaxRows: 1,000,000; CPU was around 80 percent
Thread: 40, Batch: 100, AutoAdjustBufferSize: True, DefaultBufferMaxRows: 1,000,000; CPU was around 80 percent
Thread: 20, Batch: 250, AutoAdjustBufferSize: False, DefaultBufferMaxRows: 400,000, DefaultBufferSize: 50,000,000; CPU was around 80 percent
There is also an option called EngineThreads, which defaults to 10. We tried decreasing it to 1 or 2, but nothing changed.
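As a back-of-envelope check on the numbers reported in the post, the slowdown is clearly worse than linear: at the small-run rate, 250,000 records should take about 25 minutes, not 1.5-2 hours. A quick sketch (the 105-minute figure is just the midpoint of the reported 1.5-2 hour range, an assumption for illustration):

```python
# Figures taken from the post above; 105 min is the midpoint of "1.5-2 hours".
small_records = 10_000
small_minutes = 1            # "less than 1 minute"
large_records = 250_000
large_minutes = 105

# If throughput scaled linearly with record count, the large run would take:
linear_estimate = large_records / (small_records / small_minutes)
slowdown = large_minutes / linear_estimate

print(f"Expected at linear scaling: {linear_estimate:.0f} min")
print(f"Observed: {large_minutes} min -> {slowdown:.1f}x slower than linear")
```

That roughly 4x gap beyond linear scaling is consistent with some per-record cost growing with total volume (for example, the per-record business process flow lookup mentioned above), rather than a fixed per-record cost.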
Performance can depend on many factors: network latency, the particular instance you are working with, data volume, plugins or workflows, and so on. You could try our different options and see whether that helps, but note that the outcome depends on your environment and the entity you are working with, so it can differ from one case to another. We do not have standard instructions as such, but we certainly do have some recommendations. You can take a look at our FAQ page for more suggestions:
You could also enable a proxy such as Fiddler (www.telerik.com/fiddler), or use CRM trace logs, to find out where the bottleneck is. Note that trace logs have to be enabled on the CRM server side.
Let us know if there is anything else we can help you with, and do not hesitate to reach out to our Support team directly.
Unfortunately, that is a very general answer. Of course we have already checked every possible action; that is why we are asking everyone here for a way out. I contacted KingswaySoft support several times, but it was not helpful at all. My question is: when I run the same package with 10,000 records versus 250,000 records, why do the speed and CPU degrade so badly, and how can I improve it? Please do not give such general advice. Our project is live, and trial and error is not getting us anywhere.
As a first step, I'd suggest you do a more detailed performance analysis to determine which process(es) have the high CPU usage. This will help establish whether the issue is on the SSIS side (either dtexec.exe or dtsdebughost.exe, depending on how you run the package) or the SQL side (sqlservr.exe). Note that if the issue is on the SSIS side, it is not easy to tell whether the root cause lies with KingswaySoft, with the SSIS engine, or with a combination of the two.
Also, can you split the package so that it runs for only 10,000 records at a time? For example, if you run the package 25 times (either in serial or in parallel), each time for 10,000 records, is that faster than running it once for 250,000 records?
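The chunked-run idea can be sketched in plain Python. To be clear, the record list and slicing below are placeholders for illustration, not KingswaySoft or SSIS APIs; in practice the split would be done inside the package, e.g. with a range filter on the source query or a staging table:

```python
def chunks(records, size):
    """Yield successive fixed-size slices of `records`."""
    for start in range(0, len(records), size):
        yield records[start:start + size]

# Hypothetical stand-in for the 250,000 source rows.
records = list(range(250_000))

# One "package run" per 10,000-record slice instead of a single 250,000-record run.
batches = list(chunks(records, 10_000))
print(len(batches))  # 25 runs of 10,000 records each
```

If 25 small runs finish in well under the 1.5-2 hours a single large run takes, that points at some cost that grows with the size of an individual run (buffers, per-run lookups, lock contention) rather than with the total record count.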
In case it helps, the combinations we tried to get the optimum speed are listed above.
And, as the experts suggested, it could depend on other factors as well.