Shorter Customer Aging Snapshot Batch Run Times Using Smarter Bundle Sizes (AX 2012)
Our company wanted to reduce the run time of the Customer Aging Snapshot batch job. Before doing any performance analysis of my own, I searched the web to see whether others had the same issue; it turns out this is a commonly long-running batch job. I looked at the SQL statements and did not see any significant opportunities for better performance. The sheer volume of SQL statements is what makes the job take so long to run.
I found that the job breaks itself up into customer bundles, creating a new task for each bundle of 120 customers. Looking at the history of our runs, I could see that a few customer bundles were consistently the long-running tasks. At first I thought that if I could flag these customers as long-running, I could add logic to the customer bundling to isolate them in their own bundles.
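To make the original behavior concrete, here is a minimal sketch (in Python, not the actual X++ code, and with hypothetical names) of the stock approach: customers are simply taken in order and chopped into fixed-size bundles of 120.

```python
def fixed_bundles(customers, bundle_size=120):
    """Split the customer list into consecutive bundles of bundle_size.

    Because customers arrive in ID order, customers with similar IDs
    (including the slow ones) tend to land in the same bundle.
    """
    return [customers[i:i + bundle_size]
            for i in range(0, len(customers), bundle_size)]
```

This is why the slow customers clustered together: the split knows nothing about per-customer cost, only position in the list.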
First I added logging to the job so I could see how long each customer took to process. Sure enough, a handful of customers took much longer to process than the others. As luck would have it, those customers kept landing in the same bundle because their customer IDs were similar. That bundle would take over an hour to process while other bundles finished in minutes.
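The per-customer timing can be sketched as a wrapper around the existing aging call. This is an illustration only: `process_with_timing` and `age_customer` are hypothetical names, since the post does not show the X++ changes.

```python
import time

def process_with_timing(customers, age_customer):
    """Run the existing aging logic per customer, recording seconds taken."""
    timings = {}
    for cust in customers:
        start = time.perf_counter()
        age_customer(cust)  # existing aging logic, unchanged
        timings[cust] = time.perf_counter() - start
    return timings
```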
Next I considered how the job itself could detect which customers are long-running. The solution I ended up with has been in production for several months now and has brought the run time down from an hour or two to about 20 minutes.
Here is what the code now does. As each customer is processed, it times the aging for that customer and places the result in a record set; at the end of the bundle it inserts this record set into a new table. The record set contains Partition, Data Area, Customer, and the seconds to process. A new job then takes the data in this table and applies it to a new column directly on the customer table. I also created a new control table where I configure a target number of threads for the Customer Aging batch job along with a few other control parameters. The logic that breaks the job into customer bundles now uses the target thread count, the number of customers, and the total time across all customers to create bundles with a more even run time, and therefore a shorter total run time.
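The bundling described above amounts to a load-balancing heuristic. One common way to get "bundles with a more even run time" from per-customer seconds and a target thread count is a greedy longest-processing-time-first assignment; the sketch below (Python, hypothetical names — the post does not show the actual X++ implementation) illustrates that idea.

```python
import heapq

def balanced_bundles(customer_seconds, target_threads):
    """Greedy LPT assignment: take customers slowest-first and put each
    into the bundle with the smallest running total, so bundle run times
    come out roughly even across the target number of threads."""
    # Min-heap of (total_seconds_so_far, bundle_index).
    heap = [(0.0, i) for i in range(target_threads)]
    heapq.heapify(heap)
    bundles = [[] for _ in range(target_threads)]
    for cust, secs in sorted(customer_seconds.items(),
                             key=lambda kv: kv[1], reverse=True):
        total, i = heapq.heappop(heap)
        bundles[i].append(cust)
        heapq.heappush(heap, (total + secs, i))
    return bundles
```

With this scheme a single slow customer no longer drags a whole bundle past an hour; it simply occupies one thread while the cheaper customers fill out the others.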
Beyond the shorter batch run time, the advantages of this approach are that it maintains itself, that I have a known number of threads, and that I did not have to change or even understand the internal aging logic. One obvious drawback is that I did not reduce any load on the system. If anyone is interested I can provide more details.
Below is a sample Batch History prior to the change (60 minute run time):
Here is a sample Batch History after the change (10 minute run time):
Here is a sample log of an individual task showing the varied customer run times:
Comments
Yadhav - I do not see the friend request but given you provided your email address I sent you something that shows the code changes.
Yadhav - I sent you a friend request through this forum. Respond with your email address and I can send a difference file which details the changes.
Hi, can you let me know what code needs to be written and where? We have the same issue and want to resolve it ASAP.
*This post is locked for comments