
I have a D365 Finance & Operations integration requirement where the inbound data volume per run is 10,000+ records, and this volume is expected to grow further. The system needs to import this data into custom tables and then process it through business logic.
I want to select the most reliable, scalable, and supportable integration approach.
I am considering the following options:
a) DMF / Data Entities (Recurring imports / BYOD / Staging) - see the rough sketch after this list for the kind of call I have in mind
b) Custom X++ service + asynchronous batch framework
c) Dual write scenarios
d) OData or Custom Service endpoints
e) Azure Data Factory + Data Lake staging
f) Queue-based integrations (Azure Service Bus / Event Grid / Batch)
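For context, this is roughly how I imagined pushing each run for option (a), using the Recurring Integrations enqueue endpoint. It is only a minimal sketch, not a working implementation: the environment URL, tenant/app registration values, activity ID, and entity name are placeholders, and it assumes a recurring data job has already been set up on an import project in Data management.

```python
# Minimal sketch of option (a): push one inbound file to a D365 F&O
# recurring data job. All identifiers below are placeholders.
import requests

TENANT_ID = "<aad-tenant-id>"                        # placeholder
CLIENT_ID = "<app-registration-id>"                  # placeholder
CLIENT_SECRET = "<app-secret>"                       # placeholder
FNO_URL = "https://myenv.operations.dynamics.com"    # placeholder environment URL
ACTIVITY_ID = "<recurring-data-job-activity-id>"     # from the recurring data job
ENTITY_NAME = "MyCustomStagingEntity"                # placeholder custom entity


def get_token() -> str:
    # Client-credentials flow against Azure AD for the F&O environment.
    resp = requests.post(
        f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/token",
        data={
            "grant_type": "client_credentials",
            "client_id": CLIENT_ID,
            "client_secret": CLIENT_SECRET,
            "resource": FNO_URL,
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]


def enqueue_file(path: str) -> str:
    # Recurring Integrations enqueue endpoint; the response body is a
    # message ID that can later be used to check import status.
    with open(path, "rb") as f:
        resp = requests.post(
            f"{FNO_URL}/api/connector/enqueue/{ACTIVITY_ID}",
            params={"entity": ENTITY_NAME},
            headers={
                "Authorization": f"Bearer {get_token()}",
                "Content-Type": "text/csv",  # assuming a CSV file per run
            },
            data=f.read(),
            timeout=300,
        )
    resp.raise_for_status()
    return resp.text


if __name__ == "__main__":
    print(enqueue_file("inbound_records.csv"))
```

My concern is mainly whether this kind of file-based import into a staging entity, followed by batch processing, still holds up at the higher end of the volumes below, or whether I should move toward one of the other options.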
My questions are:
For record counts of 10,000 to 100,000+, what integration approach is recommended?
Are data entities performant enough for this volume?
Does Microsoft recommend using DMF over OData when dealing with bulk inserts?
What real-world performance limitations should I be aware of?
Any success stories or lessons learned from similar volumes?
Any guidance, architectural direction, or reference documentation would be greatly appreciated.