
Back-end server load balancing (on-premise servers)

Posted by Pekka Häkkinen

I have a system setup where we have several million customers, and we run daily batches to calculate daily time series for each account.

  • Each account has an account number.
  • We have a workflow for this task, which takes min_account_number and max_account_number parameters.
  • Thus I can start one workflow per account range: 1..10000, 10001..20000, and so on (see the sketch after this list).
  • We can have multiple back-end servers.
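
To make the batching concrete, here is a minimal sketch (purely illustrative) of how the (min_account_number, max_account_number) ranges could be generated. The account_ranges helper and start_workflow placeholder are not real APIs from our system; they only stand in for however the workflow is actually launched.

```python
def account_ranges(total_accounts, batch_size):
    """Yield (min_account_number, max_account_number) pairs,
    e.g. (1, 10000), (10001, 20000), ... for contiguous account numbers."""
    for start in range(1, total_accounts + 1, batch_size):
        yield start, min(start + batch_size - 1, total_accounts)


def start_workflow(min_account_number, max_account_number):
    # Placeholder: in the real system this would queue one workflow
    # instance (one async operation) with these two parameters.
    print(f"queue workflow for accounts {min_account_number}..{max_account_number}")


if __name__ == "__main__":
    # e.g. 1,000,000 accounts split into 100 batches of 10,000 accounts each
    for lo, hi in account_ranges(total_accounts=1_000_000, batch_size=10_000):
        start_workflow(lo, hi)
```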

Now my question is: if we run, for example, 100 workflows at the same time:

1) FRONT-END SERVERS: which parameters should I check on the front-end machines? Is there a parameter that limits how many workflows I can start at the same time?

2) BACK-END SERVERS: how do I balance the load across multiple back ends so that each of them is loaded equally? Which parameters should I check there?

3) Other considerations, e.g. for threading, locking, etc.?

Cheers,

PEKKA

  • Verified answer
    Hugo Serras
    RE: Back-end server load balancing (on-premise servers)

    Yes. The jobs are consumed according to that parameter. The jobs that go to that queue are not only workflows, but also all the other system jobs the system may have to run. By default, the value for on-premise is 2000, and you should be careful changing it if you need to, since you also have to keep in mind the kind of infrastructure you have.
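
    As a hedged illustration of "not only workflows": the sketch below counts what is currently waiting in that queue, grouped by job type. It assumes the standard AsyncOperationBase columns (OperationType, StateCode) in the organization database; the connection string is a placeholder, and the column names should be verified against your CRM version.

```python
# Count pending async operations per type to see that the queue holds more
# than just workflows (system jobs, bulk deletes, etc.).
# Assumption: AsyncOperationBase exposes OperationType and StateCode
# (StateCode 0 = Ready); verify against your organization database schema.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=YOUR_SQL_SERVER;DATABASE=YourOrg_MSCRM;Trusted_Connection=yes;"
)
rows = conn.cursor().execute(
    "SELECT OperationType, COUNT(*) AS pending "
    "FROM AsyncOperationBase WITH (NOLOCK) "
    "WHERE StateCode = 0 "          # 0 = Ready (waiting to be picked up)
    "GROUP BY OperationType "
    "ORDER BY pending DESC"
).fetchall()
for operation_type, pending in rows:
    print(operation_type, pending)
conn.close()
```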

  • Verified answer
    Pekka Häkkinen
    RE: Back-end server load balancing (on-premise servers)

    Hi, thank you for your help. So there are no specific parameters/limits on how many workflows one can start and have in progress at the same time? The only things affecting that are the number of BACK-END servers and these AsyncItemsInMemory parameters?

  • Verified answer
    Hugo Serras
    RE: Back-end server load balancing (on-premise servers)

    Hello Pekka,

    I hope you are doing great.

    You do not need to load balance multiple back ends yourself. The parameters are explained below.

    Async polling is a continuous operation where the async server(s) check the AsyncOperationBase table (a database component) for jobs to execute. The asynchronous service polls for new jobs every 5 seconds by default, based on the AsyncSelectInterval value in the MSCRM_CONFIG database.
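
    If you want to confirm the polling interval on your own deployment, here is a hedged sketch that reads AsyncSelectInterval straight from MSCRM_CONFIG. It assumes the value lives in the DeploymentProperties table (mentioned below) with ColumnName/IntColumn columns; the server name is a placeholder, and the schema should be checked against your CRM version.

```python
# Read the async polling interval from MSCRM_CONFIG (assumed schema:
# DeploymentProperties with ColumnName / IntColumn columns).
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=YOUR_SQL_SERVER;DATABASE=MSCRM_CONFIG;Trusted_Connection=yes;"
)
row = conn.cursor().execute(
    "SELECT ColumnName, IntColumn FROM DeploymentProperties "
    "WHERE ColumnName = 'AsyncSelectInterval'"
).fetchone()
print(row)  # expected something like ('AsyncSelectInterval', 5)
conn.close()
```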

    Here are some terms:

    AsyncItemsInMemoryHigh – Maximum number of async operations the service will store in memory. On each selection interval, if the number of items in memory has fallen below AsyncItemsInMemoryLow, the service will pick up enough jobs to reach AsyncItemsInMemoryHigh again.

    AsyncItemsInMemoryLow – Minimum number of async operations the service needs to have in memory before loading new jobs. On each selection interval, if the number of items in memory falls below this value, the service will pick up enough jobs to reach AsyncItemsInMemoryHigh again.

    The polling works in batches based on the AsyncItemsInMemoryHigh value from the DeploymentProperties table in the MSCRM_CONFIG database; by default, this is 2000.
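
    To make the high/low watermark behaviour concrete, here is a minimal toy model of the refill logic described above. The numbers are only illustrative (2000 is the default high value mentioned in this thread; the low value and the completion rate are made up), and the real service obviously does much more than this.

```python
# Toy model of the AsyncItemsInMemoryLow / AsyncItemsInMemoryHigh refill
# behaviour: on each polling interval the service only tops up its in-memory
# queue when the count has fallen below the low watermark, and then fills
# back up to the high watermark from the AsyncOperationBase backlog.
ASYNC_ITEMS_IN_MEMORY_HIGH = 2000   # default mentioned in this thread
ASYNC_ITEMS_IN_MEMORY_LOW = 200     # illustrative only

def poll(in_memory, backlog):
    """One polling cycle (every AsyncSelectInterval seconds)."""
    if in_memory < ASYNC_ITEMS_IN_MEMORY_LOW:
        take = min(ASYNC_ITEMS_IN_MEMORY_HIGH - in_memory, backlog)
        in_memory += take
        backlog -= take
    return in_memory, backlog

# Example: 100,000 queued jobs (e.g. 100 workflows fanning out over accounts),
# with roughly 150 jobs completing per polling interval.
in_memory, backlog = 0, 100_000
cycles = 0
while backlog or in_memory:
    in_memory, backlog = poll(in_memory, backlog)
    in_memory = max(0, in_memory - 150)   # pretend 150 jobs finish per cycle
    cycles += 1
print(f"backlog drained after {cycles} polling cycles")
```

    Under this model, each back-end async service simply drains the same shared queue, which matches the point above that no separate back-end load balancing needs to be configured.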

    I hope this helps.

    Best regards
