
Best integration approach for importing 10,000+ inbound records into D365 FinOps


I have a D365 Finance & Operations integration requirement where the inbound data volume per run is 10,000+ records, and the volume may grow further. The system needs to import this data into custom tables and then process it with business logic.

I want to select the most reliable, scalable, and supportable integration approach.

I am considering the following options:

a) DMF / Data Entities (Recurring imports / BYOD / Staging)

b) Custom X++ service + asynchronous batch framework

c) Dual-write scenarios

d) OData or Custom Service endpoints

e) Azure Data Factory + Data Lake staging

f) Queue-based integrations (Azure Service Bus / Event Grid / Batch)

 

My questions are:

  • For record counts of 10,000 to 100,000+, which integration approach is recommended?
  • Are data entities performant enough for this volume?
  • Does Microsoft recommend DMF over OData for bulk inserts?
  • What real-world performance limitations should I be aware of?
  • Are there any success stories or lessons learned from similar volumes?

Any guidance, architectural direction, or reference documentation would be greatly appreciated.

Martin Dráb (Most Valuable Professional) replied:
The number of records alone isn't sufficient information. You also need to think about how quickly you need them imported: it makes a big difference whether you have an hour or a week for it. Please tell us more about the business scenario.

Please explain what you mean by "then process it within business logic". Can you bulk-import the data and run your business logic afterwards, or do you require running some business logic during the import? Are you sure you can't avoid it?

Also, consider whether the data can be processed in parallel.

What should happen in case of a failure? Do you want to roll back everything? Skip just the particular record? Or something else?

Do you need to send a response to the other system you integrate with?

When you have a better idea of your requirements, you can check whether and how they can be handled by the various options. You can't decide what's best if you don't know what it's for.

Regarding your list of options, I don't know what you mean by Azure Data Factory + Data Lake staging, how dual-write would help you, or what the difference is between a custom X++ service and custom service endpoints.
Here are my answers to some of your questions:
1. For record counts of 10,000 to 100,000+, what integration approach is recommended?
   It depends, as discussed above.
2. Are data entities performant enough for this volume?
   Yes, bulk imports through the data management APIs are pretty efficient, if you can use them (which depends on your specific requirements).
3. Does Microsoft recommend using DMF over OData when dealing with bulk inserts?
   It depends on what you mean. I guess you want to compare the usual OData services (the /data/ endpoint) with the data management APIs. In that case, the data management APIs are preferred for large volumes of data. But note that the packages API is actually an OData service too, so you technically have OData on both sides of the comparison. As you can see, it's not about OData as such, but about what the OData service actually does. (A minimal sketch of calling the packages API follows below.)
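To make point 3 concrete, here is a minimal sketch of driving a bulk import through the Data management package REST API, the OData actions referred to above. It assumes a client-credentials app registration; the tenant, environment URL, package file name, definition group, and legal entity below are placeholders, not values from this thread. The flow is: request a writable blob URL, upload a data package zip, trigger the import, then poll the execution status.

import json
import time

import requests

# Placeholder values (assumptions) -- replace with your own tenant,
# app registration, environment URL, and DMF import project.
TENANT = "yourtenant.onmicrosoft.com"
CLIENT_ID = "<app-registration-id>"
CLIENT_SECRET = "<app-secret>"
FNO_URL = "https://yourenv.operations.dynamics.com"

# 1. Acquire an OAuth token for the environment (client-credentials flow).
token = requests.post(
    f"https://login.microsoftonline.com/{TENANT}/oauth2/v2.0/token",
    data={
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "scope": f"{FNO_URL}/.default",
    },
).json()["access_token"]
headers = {"Authorization": f"Bearer {token}"}
dmf = f"{FNO_URL}/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities"

# 2. Ask DMF for a writable Azure blob URL, then upload the data package
#    (a zip containing Manifest.xml, PackageHeader.xml, and the data files).
write_url = requests.post(
    f"{dmf}.GetAzureWriteUrl",
    headers=headers,
    json={"uniqueFileName": "MyImport.zip"},
).json()
blob_url = json.loads(write_url["value"])["BlobUrl"]
with open("MyImport.zip", "rb") as f:
    requests.put(blob_url, data=f, headers={"x-ms-blob-type": "BlockBlob"})

# 3. Trigger the import; "MyImportProject" must be an existing DMF import project.
execution_id = requests.post(
    f"{dmf}.ImportFromPackage",
    headers=headers,
    json={
        "packageUrl": blob_url,
        "definitionGroupId": "MyImportProject",
        "executionId": "",  # empty string lets DMF generate one
        "execute": True,
        "overwrite": True,
        "legalEntityId": "USMF",
    },
).json()["value"]

# 4. Poll the execution summary until DMF reports a terminal state.
status = "NotRun"
while status not in ("Succeeded", "PartiallySucceeded", "Failed", "Canceled"):
    time.sleep(15)
    status = requests.post(
        f"{dmf}.GetExecutionSummaryStatus",
        headers=headers,
        json={"executionId": execution_id},
    ).json()["value"]
print("Import finished with status:", status)

Because the import then runs server-side through DMF staging tables, there are no per-record round trips, which is why this pattern tends to scale to the 10,000 to 100,000+ range far better than row-by-row OData writes.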

