Finance | Project Operations, Human Resources, ...
Answered

Best integration approach for importing 10,000+ inbound records into D365 FinOps

Posted by vishalsahijwani (168)

I have a D365 Finance & Operations integration requirement where the inbound data volume per run is 10,000+ records, and this volume may grow further. The system needs to import this data into custom tables and then process it within business logic.

 

I want to select the most reliable, scalable, and supportable integration approach.

I am considering the following options:

a) DMF / Data Entities (Recurring imports / BYOD / Staging)

b) Custom X++ service + asynchronous batch framework

c) Dual write scenarios

d) OData or Custom Service endpoints

e) Azure Data Factory + Data Lake staging

f) Queue-based integrations (Azure Service Bus / Event Grid / Batch)

 

My questions are:

 
  • For record counts of 10,000 to 100,000+, what integration approach is recommended?


  • Are data entities performant enough for this volume?


  • Does Microsoft recommend using DMF over OData when dealing with bulk inserts?


  • What are real-world performance limitations I should be aware of?


  • Any success stories or lessons learned from similar volumes?



Any guidance, architectural direction, or reference documentation would be greatly appreciated.

  • Martin Dráb (237,965, Most Valuable Professional)
    The number of records alone isn't sufficient information. You need to think about how quickly you need to get them imported. It's a big difference whether you have an hour or a week for it. Please tell us more about the business scenario.
     
    Do you need to import the data ASAP, or rather wait outside of business hours?
     
    Please explain what you mean by "then process it within business logic". Can you bulk-import the data and run your business logic afterwards? Or do you require running some business logic during the import? Are you sure you can't avoid it?
     
    Also, consider whether the data can be processed in parallel.
     
    What should happen in case of a failure? Do you want to roll back everything, skip just the failing record, or something else?
     
    Do you need to send a response to the other system you integrate with?
     
    When you have a better idea about your requirements, you can check how, and whether, these requirements can be handled by the various options. You can't decide what's best if you don't know what it's for.
     
    Regarding your list of options, I don't know what you mean by Azure Data Factory + Data Lake staging, how dual-write would help you, or what the difference is between a custom X++ service and custom service endpoints.
     
    Here are my answers to some of your questions:
    1. For record counts of 10,000 to 100,000+, what integration approach is recommended?
      It depends, as discussed above.
    2. Are data entities performant enough for this volume?
      Yes, bulk imports through data management APIs are pretty efficient, if you can use them (which depends on your specific requirements).
    3. Does Microsoft recommend using DMF over OData when dealing with bulk inserts?
      It depends on what you mean. I guess you want to compare the usual OData services (the /data/ endpoint) with the data management APIs. In that case, the data management APIs are preferred for large volumes of data. But note that the packages API is actually an OData service too, so technically you have OData on both sides of the equation. It's not about OData as such, but about what the OData service actually does (see the sketch below).
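    For illustration only, here is a minimal sketch of a bulk import through the Data management package REST API. The action names (GetAzureWriteUrl, ImportFromPackage) are the documented ones; everything else (environment URL, app registration, data project name, legal entity) is a placeholder, and the package zip is assumed to already contain the manifest files exported from the target data project.

```python
# Illustrative sketch: bulk import through the Data management package REST API.
# Action names are the documented ones; all other values are placeholders.
import json
import requests

TENANT = "<aad-tenant-id>"
FNO_URL = "https://<your-environment>.operations.dynamics.com"
CLIENT_ID = "<app-registration-client-id>"
CLIENT_SECRET = "<client-secret>"

def get_token() -> str:
    # Client-credentials flow; the resource is the F&O environment URL.
    resp = requests.post(
        f"https://login.microsoftonline.com/{TENANT}/oauth2/token",
        data={
            "grant_type": "client_credentials",
            "client_id": CLIENT_ID,
            "client_secret": CLIENT_SECRET,
            "resource": FNO_URL,
        },
    )
    resp.raise_for_status()
    return resp.json()["access_token"]

def import_package(zip_path: str, data_project: str, legal_entity: str) -> str:
    """Uploads a prepared data package (.zip with manifest + data file) and
    starts the import. Returns the DMF execution id."""
    headers = {"Authorization": f"Bearer {get_token()}"}
    base = f"{FNO_URL}/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities"

    # 1. Ask F&O for a writable blob URL (uniqueFileName should be unique per upload).
    r = requests.post(f"{base}.GetAzureWriteUrl",
                      headers=headers, json={"uniqueFileName": "inbound-package-001"})
    r.raise_for_status()
    blob_url = json.loads(r.json()["value"])["BlobUrl"]

    # 2. Upload the package zip to that blob.
    with open(zip_path, "rb") as f:
        requests.put(blob_url, data=f,
                     headers={"x-ms-blob-type": "BlockBlob"}).raise_for_status()

    # 3. Start the import against an existing import data project.
    r = requests.post(f"{base}.ImportFromPackage", headers=headers, json={
        "packageUrl": blob_url,
        "definitionGroupId": data_project,   # import project created in Data management
        "executionId": "",                   # let F&O generate one
        "execute": True,
        "overwrite": True,
        "legalEntityId": legal_entity,
    })
    r.raise_for_status()
    return r.json()["value"]                 # execution id

# Example: execution_id = import_package("inbound.zip", "MyCustomImport", "USMF")
```

    One request like this can carry the whole 10,000+ record file, which keeps the number of calls to F&O small regardless of the record count.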
  • Sagar121 (872)
    Hi,
     
    You can get most of your answers from this doc.
     
     
    Dual-write is about duplicating data in Dataverse.
  • vishalsahijwani (168)
    Thanks @Martin Dráb for your response. I need to understand the situation where it's a real-time integration between D365 F&O and the external system, and also whether this can be done after business hours. Both Power Automate and Logic Apps have their own limitations, throwing a "Rate limit exceeded" error after a certain data threshold.
     
    I thought of a custom service in D365 F&O, but if the count is in the range of 10,000 to 100,000 records a day, then even a custom service will start having problems handling the API requests for such a huge number of records after a certain point.
     
    RIS (Recurring Integrations Scheduler) can help with large data volumes up to a certain limit, but I want to understand whether there is a more radical solution that can help.
     
    To be honest, a custom service is the horse to bet on, but how can we make it more robust to handle large data volumes?
     
  • Verified answer
    Martin Dráb (237,965, Most Valuable Professional)
    That you have 100,000 records doesn't mean that you must, or should, make 100,000 separate requests. The overhead would slow everything down, and it could cause such severe performance problems that Microsoft actively protects the ERP system from an excessive number of requests.
     
    You can, for example, create a file with all the records and push it to F&O through a single request to a data management API. Or you could create smaller packages, e.g. of a few thousand records each, and send them to a data management API, to a custom service, or so. You won't have a problem with the number of incoming requests if you don't make any at all, e.g. if you have a process in F&O that gets the data from a message queue in Azure. Message queues are also good for load distribution: if there are many requests in a short period, you simply end up with a longer queue, instead of making too many requests to F&O and running into the protection limits.
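    To make the chunking and queue idea concrete, here is a hedged sketch (not from the thread) of a consumer that drains an Azure Service Bus queue in batches of a few thousand messages and turns each batch into one CSV, so each batch becomes a single data-package import rather than thousands of individual requests. The queue name, batch size, JSON message format, and the submit_package helper (e.g. a wrapper around ImportFromPackage as sketched earlier) are all assumptions.

```python
# Illustrative sketch of the queue-based pattern: drain a Service Bus queue in
# chunks of a few thousand messages and turn each chunk into one DMF package
# import, instead of one F&O request per record.
import csv
import io
import json
from azure.servicebus import ServiceBusClient

CONN_STR = "<service-bus-connection-string>"
QUEUE = "inbound-records"          # placeholder queue name
BATCH_SIZE = 5000                  # a few thousand records per package

def drain_queue_in_batches(submit_package) -> None:
    with ServiceBusClient.from_connection_string(CONN_STR) as client:
        with client.get_queue_receiver(queue_name=QUEUE) as receiver:
            while True:
                msgs = receiver.receive_messages(max_message_count=BATCH_SIZE,
                                                 max_wait_time=30)
                if not msgs:
                    break  # queue drained

                # Build one CSV (the package's data file) from the whole batch.
                buf = io.StringIO()
                writer = None
                for msg in msgs:
                    record = json.loads(str(msg))      # assumes JSON message bodies
                    if writer is None:
                        writer = csv.DictWriter(buf, fieldnames=record.keys())
                        writer.writeheader()
                    writer.writerow(record)

                # One import request for the few thousand records in this batch.
                submit_package(buf.getvalue())

                # Remove messages from the queue only after F&O accepted the batch.
                for msg in msgs:
                    receiver.complete_message(msg)
```

    A spike on the sending side then just makes the queue longer; F&O keeps receiving the same steady, small number of package requests.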
     
    Note that the problem isn't in Power Automate and Logic Apps; it's in the logic you have in there.
     
    RIS is a client application for calling data management APIs from a Windows machine, based on files in a folder. You definitely should consider data management APIs (as the very first thing, I'd say), but you can call the API by any other means, e.g. from an Azure Function, from Power Automate, directly from another application, and so on.
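    Whichever host ends up calling the data management API (an Azure Function, a console job, Power Automate, an RIS-like client), it usually also polls the import execution until DMF finishes. A minimal sketch, assuming the execution id returned by ImportFromPackage above and the documented GetExecutionSummaryStatus action:

```python
# Illustrative: poll the DMF execution until it reaches a terminal status.
# Status values shown in the comment are examples; wait times are arbitrary.
import time
import requests

def wait_for_import(fno_url: str, headers: dict, execution_id: str) -> str:
    url = (f"{fno_url}/data/DataManagementDefinitionGroups"
           f"/Microsoft.Dynamics.DataEntities.GetExecutionSummaryStatus")
    while True:
        r = requests.post(url, headers=headers, json={"executionId": execution_id})
        r.raise_for_status()
        status = r.json()["value"]   # e.g. Executing, Succeeded, PartiallySucceeded, Failed
        if status not in ("NotRun", "Executing"):
            return status
        time.sleep(15)               # poll every 15 seconds
```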
     
    I don't know why you think that using a custom service is the most efficient way; ultimately it's about the business logic you implement there. You can still implement efficient bulk imports inside a custom service, but you can also use custom services in a very inefficient way. And this applies to the other options too.
