
Performance best practice on batch job – Deploy measurement

Chris Bittner

In Dynamics 365 for Finance and Operations, Microsoft proactively monitors your environments and tries to address issues before they become problems for you. During that monitoring effort, we have seen several cases where customers ran into performance challenges due to high DTU utilization on the SQL side, caused simply by an incorrect configuration of the batch job “Deploy measurement” (class BIMeasurementDeployManagementEntityBatchJob). This article provides best practices on how to configure it, the existing hotfixes, and upcoming improvements that you should be aware of.

For customers who are below Platform Update 23, we suggest that both the customer and the business partner look into the reports needed for their BI and, from there, identify the aggregate measurements that need to be refreshed. Go through the list of aggregate measurements and put them into the following categories (and note which measurements belong to which category for later reference).

A. Measurements for which no Power BI reports are used. These should not be processed.
B. Measurements for which the customer wants the data updated frequently.
C. Measurements for which the customer does not need the data updated frequently.
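The triage above is just a lookup from measurement to category. The following is an illustrative Python sketch, not Dynamics code, and the measurement names are hypothetical examples chosen only to show the shape of the exercise:

```python
# Hypothetical triage of aggregate measurements into the three categories above.
# A: no Power BI report uses it -> never processed.
# B: refreshed frequently.  C: refreshed rarely.
CATEGORY = {
    "CustCollectionsBICube": "B",     # hypothetical: used in daily dashboards
    "SalesCube": "B",                 # hypothetical: used in daily dashboards
    "LedgerActivityMeasure": "C",     # hypothetical: used in a monthly report
    "RetailChannelPerformance": "A",  # hypothetical: no report consumes it
}

def measurements_for(category):
    """Return the measurements belonging to one category, sorted for stability."""
    return sorted(m for m, c in CATEGORY.items() if c == category)

high_frequency = measurements_for("B")  # goes into the first batch job instance
low_frequency = measurements_for("C")   # goes into the second batch job instance
never_process = measurements_for("A")   # excluded from both instances
```

Keeping the category assignments written down in one place like this makes it easy to verify later which job each measurement was scheduled under.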

Next, cancel the currently scheduled “Deploy measurement” batch jobs (from the Batch jobs form under System administration) and replace them with at most two instances of the batch job: one handling the measurements that need to be updated frequently (B), and the other handling the measurements that are rarely used (C).

To recreate the batch jobs, open the Aggregate measurements form (System administration > Setup > Entity Store), select only the high-frequency measurements (B), and click Refresh to start a new batch job instance, setting the recurrence to the desired frequency. We advise running this batch job once or possibly twice a week.
Monitor the system for a while and see whether it still hits high SQL Azure DTU while processing measurements. If it is OK, select the low-frequency measurements (C) and start a second instance of the batch job, scheduling those once a week or so.

In no case should the same measurement be selected for both batch job instances.
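This constraint is easy to check mechanically. Below is a minimal sketch (plain Python, with hypothetical measurement lists) that verifies the two batch job selections do not share any measurement:

```python
# Hypothetical selections for the two batch job instances.
frequent_job = {"CustCollectionsBICube", "SalesCube"}       # category B
rare_job = {"LedgerActivityMeasure", "BudgetActivityCube"}  # category C

# The same measurement must never be selected for both instances,
# otherwise it would be processed twice and waste DTU.
overlap = frequent_job & rare_job
assert not overlap, f"Measurements scheduled in both jobs: {sorted(overlap)}"
```

If the assertion fires, remove the offending measurement from one of the two selections before scheduling the jobs.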


If you are still on application release 7.3 (first of all, you should take action immediately to upgrade to the latest release and align with Microsoft’s One Version policy; check out the One Version FAQ), please make sure you take up the application hotfixes below, which help improve performance for the relevant measurements:


1. KB4464977 - Perf issue for VendPaymentBIInvoicesPaid which takes hours to push data from AXDB to Entity store (7.3 HF)

2. KB4458727 - Sales Cube/WHSWarehouse queries on SalesLineExpanded has poor performance when doing entity store full reset


As part of the Platform Update 23 release, we introduced Automated entity store refresh. Different refresh recurrences are available there, and we discontinued the batch job BIMeasurementDeployManagementEntityBatchJob, replacing it with the class BIMeasurementProcessorFull. We suggest you review the recurrence configuration for each measurement there and make sure it aligns with the customer’s business needs.

In addition, we will introduce further improvements in this area in an upcoming platform update release. Hence, it is vital to stay current.
