Recently I learnt how powerful the batch framework in AX is, and discovered how to improve the performance of long-running operations.
Most long-running operations in Dynamics AX can be scheduled to run in batch, and in most cases you can explicitly define a query to select which data to process. It is often simple to create a single query that selects all the data to process and then schedule the operation for batch. Doing it this way starts one batch operation that processes the data piece by piece. It works, and it is simple to maintain, but it is not necessarily fast.
Instead, consider defining multiple queries that each cover a portion of the data to process, and then schedule them all to run in batch at the same time. Suddenly you have parallel processing.
Here is a real-life example.
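The idea can be sketched outside of AX. This is a minimal Python illustration of the pattern, not AX code: `process` stands in for any long-running batch operation, and `split` stands in for defining multiple queries that each cover a slice of the data.

```python
from concurrent.futures import ThreadPoolExecutor

def process(records):
    """Stand-in for a long-running batch operation over a set of records."""
    return [r * 2 for r in records]  # pretend work

def split(records, n):
    """Partition the full data set into n roughly equal slices,
    mimicking n templates/queries that each cover part of the data."""
    return [records[i::n] for i in range(n)]

records = list(range(30_000))

# Single query: one batch job processes everything piece by piece.
serial_result = process(records)

# Multiple queries: n jobs scheduled at the same time, one per slice.
with ThreadPoolExecutor(max_workers=8) as pool:
    parts = pool.map(process, split(records, 8))
    parallel_result = [r for part in parts for r in part]

# Same work gets done either way; the parallel version just overlaps it.
assert sorted(parallel_result) == sorted(serial_result)
```

The only change is how the input is carved up before scheduling; the per-record work is identical in both versions.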
In Dynamics AX 2012 R3 we had a customer with 30,000 items that needed replenishment in their picking warehouse. They used a fixed location for each item and Min/Max replenishment. The replenishment operation in AX is defined using a template, which consists of lines, each with a query that specifies the items and locations to replenish.
The original setup we deployed was a single replenishment template covering all 30,000 items. The total execution time was 2 hr 31 min.
Then we created a new template with a query that covered about half the items, and changed the original template to cover the other half. We scheduled both to start at the same time. Not surprisingly, they completed much faster, in almost half the time: total execution time was 1 hr 21 min.
Repeating the pattern, we split the replenishment into 8 templates and saw another drop in total execution time, down to just 40 minutes.
At this point the system was quite loaded: CPU averaged 80% and SQL Server constantly had a few tasks waiting. That is a good thing; the hardware is meant to be exercised.
With a few simple configuration changes, the overall execution time was cut by a factor of almost 4.
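The measured times above work out as follows (a quick arithmetic check in Python):

```python
# Measured total execution times from the runs above, in minutes.
times = {1: 2 * 60 + 31,   # single template: 2 hr 31 min
         2: 60 + 21,       # two templates:   1 hr 21 min
         8: 40}            # eight templates: 40 min

baseline = times[1]
for templates, minutes in times.items():
    print(f"{templates} template(s): {minutes} min, "
          f"speedup {baseline / minutes:.1f}x")
```

Two templates give roughly a 1.9x speedup and eight give roughly 3.8x, which is why "a factor of almost 4" is the honest headline number. Note the diminishing returns: going from 1 to 2 jobs nearly halves the time, while going from 2 to 8 does not divide it by a further 4, because the hardware is already being shared.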
This pattern can be applied in many places. The key caveat to look out for is logical dependencies between the batches created. In the example above, it is important that two batches do not replenish the same item or the same location; that could lead to one batch waiting for another batch to complete. The implementation will of course still be transactionally safe even if there are dependencies between the batches, but with dependencies you may not get the same impressive results, and some batches could end up aborted.
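One simple way to guarantee disjoint batches is to partition deterministically on the item identifier, so every item lands in exactly one batch. This is an illustrative Python sketch, not AX code; in the real scenario the split was done with item ranges in the template queries, and since each item had a fixed location, disjoint items also meant disjoint locations.

```python
from zlib import crc32

def template_for(item_id: str, n_templates: int) -> int:
    """Assign each item to exactly one template by hashing its ID,
    so no two parallel batches ever cover the same item."""
    return crc32(item_id.encode()) % n_templates

# Hypothetical item IDs standing in for the 30,000 real items.
items = [f"ITEM-{i:05d}" for i in range(30_000)]

buckets = {}
for item in items:
    buckets.setdefault(template_for(item, 8), []).append(item)

# Every item appears in exactly one bucket, so the 8 batch jobs
# share no items and cannot block each other on the same records.
assert sum(len(b) for b in buckets.values()) == len(items)
```

Any deterministic, non-overlapping split works; hashing is just convenient because it keeps the buckets roughly equal in size without maintaining explicit ranges.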