OData performance

Posted by ..
Hi,
 
There is this sentence in one of Microsoft's articles:
"If the volume is more than a few hundred thousand records, you should use the batch data API for integrations"
 
So when we say OData is not good for high data volumes and will slow performance, what does that mean exactly?
 
1. Does it mean that if we get all customers via OData, and the number of customers we have is more than 100K as the article says, then we should not use OData? But if the number of customers was 50K, then it's fine?
 
2. Or does it also depend on the size of the returned payload? I mean returning all customers in the system (100K) with all of their data, versus selecting the fields to return (still returning the 100K customers, but only one field, CustAccount, instead of all fields).
 
3. There is a batch changeset for OData. Is this only used to make two parallel calls, e.g. get customers and get vendors at the same time, or does it run in the background, which might make it good for high volume?
I mean, if I use a batch changeset but only make one call to return the 100K customers, does it make a difference in performance compared to a normal OData call without a batch changeset?
  • Suggested answer
    Saif Ali Sabri (Super User 2025 Season 2)
    Here is a clear and concise solution-style reply addressing each of your questions regarding OData performance in Dynamics 365 Finance and Operations:

    Summary
    OData is not optimal for high-volume data operations (e.g., >100K records) due to limited throughput, synchronous execution, and potential resource constraints. For large-scale data integrations, Microsoft recommends using the Data Management Framework (DMF) or Batch Data API instead.

    🔍 Detailed Answers
    1. Should we avoid OData if customer count >100K?
    Yes — if your integration requires retrieving all 100K+ customer records at once, OData is not recommended. It is synchronous and will:
    • Time out or get throttled on large payloads
    • Consume significant resources
    • Impact system performance and scalability
    If you're querying a small subset (e.g., filtered query or just 50K) and performance is acceptable, OData may still be viable. But always test performance with real data volumes.

    2. Does performance depend on payload size (selected fields)?
    Yes — payload size significantly affects OData performance.
    • Fetching 100K records with all fields (wide payload) is much heavier than fetching 100K records with just one field like CustomerAccount.
    • Reducing the fields minimizes:
      • Network load
      • Serialization/deserialization cost
      • Memory usage on the client and server
    Always use $select to limit fields and $filter to limit records when using OData.
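
    For illustration, here is a minimal Python sketch of what such a trimmed-down, paged OData read could look like. The environment URL, the token, and the entity/field names (CustomersV3, CustomerAccount, CustomerGroupId) are assumptions for the example, not a prescribed integration:

```python
import requests

# Placeholders (assumptions): your environment URL and a valid
# Azure AD bearer token for the D365FO resource.
BASE_URL = "https://yourenv.operations.dynamics.com"
TOKEN = "<bearer token>"

def fetch_customer_accounts():
    """Page through customers, returning only the account field."""
    url = (
        f"{BASE_URL}/data/CustomersV3"
        "?$select=CustomerAccount"          # fewer fields -> smaller payload
        "&$filter=CustomerGroupId eq '10'"  # fewer records, filtered server-side
    )
    headers = {"Authorization": f"Bearer {TOKEN}"}
    accounts = []
    while url:
        resp = requests.get(url, headers=headers, timeout=120)
        resp.raise_for_status()
        body = resp.json()
        accounts.extend(row["CustomerAccount"] for row in body["value"])
        # The service pages large result sets; follow @odata.nextLink
        # until it is absent instead of pulling everything in one response.
        url = body.get("@odata.nextLink")
    return accounts
```

    Even with paging, the totals discussed above still apply: this reduces payload per request, it does not make a 100K+ extraction fast.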

    3. Does OData batch changeset improve performance for high-volume calls?
    No, using a batch changeset with a single large request (like 100K records) does not improve performance over a standard OData call. Here’s why:
    • OData batch changesets are primarily useful for:
      • Combining multiple small calls into one HTTP request
      • Sending changes in a transactional group
    • They do not run in the background or in parallel unless multiple requests are sent within the batch.
    🚫 If you're using a batch to fetch a large dataset in a single request, you’re still limited by the same OData performance bottlenecks.
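
    To make the "multiple small calls in one HTTP request" point concrete, here is a hedged Python sketch of a $batch payload bundling two independent GETs (customers and vendors). The host, entity names, and boundary string are placeholders; this saves HTTP round trips, but the server still performs the same work per query:

```python
import requests

BASE_URL = "https://yourenv.operations.dynamics.com"  # placeholder
TOKEN = "<bearer token>"                              # placeholder
BOUNDARY = "batch_boundary_1"

# Two independent reads bundled into a single HTTP round trip.
# Plain GETs go directly in the batch; a changeset is only needed
# to group *writes* into one transactional unit.
request_lines = [
    f"GET {BASE_URL}/data/CustomersV3?$select=CustomerAccount&$top=100 HTTP/1.1",
    f"GET {BASE_URL}/data/VendorsV2?$select=VendorAccountNumber&$top=100 HTTP/1.1",
]

body = ""
for line in request_lines:
    body += (
        f"--{BOUNDARY}\r\n"
        "Content-Type: application/http\r\n"
        "Content-Transfer-Encoding: binary\r\n\r\n"
        f"{line}\r\n"
        "Accept: application/json\r\n\r\n"
    )
body += f"--{BOUNDARY}--\r\n"

resp = requests.post(
    f"{BASE_URL}/data/$batch",
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Content-Type": f"multipart/mixed; boundary={BOUNDARY}",
    },
    data=body,
)
print(resp.status_code)  # the response body is itself a multipart document
```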

    Recommended Solution
    Scenario → Recommended approach
    • Fetch <50K records with few fields → OData with $filter and $select
    • Fetch >100K records or a full dataset → Data Management Framework (DMF) or Batch Data API
    • High-volume writeback (insert/update) → Batch Data API (asynchronous, scalable)
    • Need background/parallel processing → DMF or a custom batch job (X++)

    Let me know if you need help converting your integration to DMF or using recurring data jobs efficiently.
     
  • .. (original poster)
    Hi Saif,
     
    Thanks for coming back to me.
     
    I want to make something clear for myself, please.
    So I think, based on the documentation, returning 100K records with all fields is acceptable, but more than that is considered not good.
     
    My question now is: if I decide to return 100K records but select only a few fields, would that allow me to return more than 100K without affecting performance, since I decreased the payload? Or is returning more than 100K records not good regardless of whether I return all fields or only a few?
  • André Arnaud de Calavon (Super User 2025 Season 2)
    Hi ..,

    100K or 50K records using OData is a lot. If you look at the example of reading a sales order status, Microsoft talks about 5,000 records per hour. Consider the time required for 10 or 20 times that: roughly 10 hours for 50K records and 20 hours for 100K.
     
    What is the actual requirement for your integration?
  • Anton Venter (Super User 2025 Season 2)
    For large amounts of data, use the Data Management Framework. The framework supports API endpoints, and you can import and export bulk data using files.
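
    For reference, DMF's package REST API exposes actions for exactly this. Below is a hedged Python sketch of an export flow; the data project name "CustomerExport" and legal entity "USMF" are assumptions, and in practice you would poll for completion before downloading:

```python
import requests

BASE_URL = "https://yourenv.operations.dynamics.com"  # placeholder
TOKEN = "<bearer token>"                              # placeholder
HEADERS = {
    "Authorization": f"Bearer {TOKEN}",
    "Content-Type": "application/json",
}
DMF = f"{BASE_URL}/data/DataManagementDefinitionGroups"

# Start an export for an existing data project; "CustomerExport" and
# "USMF" are assumptions -- use your own project and legal entity.
start = requests.post(
    f"{DMF}/Microsoft.Dynamics.DataEntities.ExportToPackage",
    headers=HEADERS,
    json={
        "definitionGroupId": "CustomerExport",
        "packageName": "CustomerExport",
        "executionId": "",   # empty lets the service generate one
        "reExecute": True,
        "legalEntityId": "USMF",
    },
)
execution_id = start.json()["value"]

# In a real integration, poll
# Microsoft.Dynamics.DataEntities.GetExecutionSummaryStatus with this
# executionId until it reports success, then fetch the download URL.
url_resp = requests.post(
    f"{DMF}/Microsoft.Dynamics.DataEntities.GetExportedPackageUrl",
    headers=HEADERS,
    json={"executionId": execution_id},
)
package = requests.get(url_resp.json()["value"]).content  # zip of data files
```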
  • .. (original poster)
    Hi @André Arnaud de Calavon, @Anton Venter,

    I don't have a specific scenario yet, but I'm trying to understand:

    1. André, regarding the 5,000 figure you mentioned: if you look at the API examples, they also talk about 200K and 300K (they jump a lot). But I do agree with you that 100K sounds like a lot. So let's say 20K is acceptable for OData when I return all customer fields, and anything above that number is not acceptable. My question is: if I start returning only a few fields for the customer, would 30K now become acceptable for OData?
    So the question is generic, regardless of the exact number: if the limit was X, would this limit increase if we decrease the payload?
     
    2. When we say OData is not good for performance when returning high volumes, does that mean D365FO performance? I mean, if we call OData to return 100K records, would people navigating the UI start facing performance issues?
     
    3. To continue on point 2, if the answer to it was yes: if we use DMF to export 100K records (without batch, I mean if we click "Export now"), why wouldn't people face performance issues when navigating the UI?

    4. If we have two different customers calling the same OData API, would it make a difference in performance if we give them the same clientId? Or is it better to give each one a separate AppId, and why?
     
  • Suggested answer
    Saif Ali Sabri (Super User 2025 Season 2)
    Can reducing payload (fewer fields) allow more than 100K records via OData?
    • Reducing payload helps, but doesn’t make OData scalable beyond 100K.
    • Even with fewer fields, high volume causes:
      • Timeouts
      • Throttling
      • Server resource strain
    🔹 Use OData for light loads only (e.g., <50K records). For anything larger, use DMF or Batch Data API.

    Does OData affect UI performance?
    • Yes — OData shares AOS resources with the UI.
    • Large OData calls can slow down the UI, especially under load.

    Why doesn’t DMF “Export Now” affect UI?
    • DMF runs asynchronously on the batch server, so it does not block UI threads.
    • It’s optimized for large data exports.

    Should different customers use separate clientId (AppId)?
    • Yes — separate AppIds:
      • Prevent shared throttling
      • Improve monitoring and security
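
    As a concrete illustration of the separate-AppId point, each integration acquires its own token from its own Azure AD app registration. A minimal MSAL Python sketch; the tenant, resource URL, client IDs, and secrets are all placeholders:

```python
import msal

TENANT = "yourtenant.onmicrosoft.com"                 # placeholder
RESOURCE = "https://yourenv.operations.dynamics.com"  # placeholder

def token_for(client_id: str, secret: str) -> str:
    """Client-credentials token for one integration's own app registration."""
    app = msal.ConfidentialClientApplication(
        client_id,
        authority=f"https://login.microsoftonline.com/{TENANT}",
        client_credential=secret,
    )
    result = app.acquire_token_for_client(scopes=[f"{RESOURCE}/.default"])
    return result["access_token"]

# Each integration gets its own app registration, so throttling,
# auditing, and credential revocation apply per caller, not globally.
token_a = token_for("<app id for integration A>", "<secret A>")
token_b = token_for("<app id for integration B>", "<secret B>")
```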

    🔚 Summary
    Case → Recommendation
    • <50K records, few fields → OData
    • >100K records (any fields) → DMF or Batch Data API
    • Heavy UI load plus integration → Avoid OData
    • Multiple customers → Separate AppIds
