Answered

Batch Integration - D365FO

Posted by MrT007

We are currently trying to integrate financial information from our bespoke ERP into Microsoft Dynamics 365 FO. The process is very much batch-oriented: we send approximately 300k-500k lines in a once-off daily batch.

The current integration goes through an API built on Microsoft Azure Logic Apps. The high-level integration flow is as follows:

1. Generate the file.

2. POST the data by invoking a Logic Apps RESTful API endpoint.

3. Once the data is received in Logic Apps, it is processed line by line into D365 FO.

The line-by-line processing into D365 FO seems somewhat inefficient, as the number of lines can be fairly large (between 300k and 500k), and could take a long time.

Is this the best/recommended approach that Microsoft offers for integrating large data sets? Or are there other, more "batch-oriented" patterns that can be recommended?

  • Verified answer
    WillWU

    Hi MrT007,

    Batch data APIs are designed to handle large-volume data imports and exports:

    docs.microsoft.com/.../recurring-integrations

    docs.microsoft.com/.../data-management-api

    Hope this helps.
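    For reference, the Recurring Integrations API linked above accepts one file or package per enqueue request rather than one call per record. Below is a minimal sketch of composing such a request; the host, activity ID, and token are placeholders (not values from this thread), and the activity ID comes from the recurring data job you configure in D365FO:

    ```python
    from urllib.parse import urlencode

    FO_HOST = "https://yourenv.operations.dynamics.com"   # placeholder environment URL
    ACTIVITY_ID = "00000000-0000-0000-0000-000000000000"  # placeholder recurring-job activity ID

    def build_enqueue_request(host, activity_id, entity, token):
        """Compose the URL and headers for a Recurring Integrations enqueue call."""
        url = f"{host}/api/connector/enqueue/{activity_id}?{urlencode({'entity': entity})}"
        headers = {
            "Authorization": f"Bearer {token}",
            # A data package is sent as a zip; a single flat file would use its own content type.
            "Content-Type": "application/zip",
        }
        return url, headers

    url, headers = build_enqueue_request(FO_HOST, ACTIVITY_ID, "General journal", "<token>")
    print(url)
    ```

    The whole 300k-500k-line file travels in that single POST body, and D365FO's recurring data job picks it up from the queue.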

  • Verified answer
    zhifeng

    Batch data APIs are designed to handle large-volume data imports and exports. It's difficult to define what exactly qualifies as a large volume. The answer depends on the entity, and on the amount of business logic that is run during import or export. However, here is a rule of thumb: If the volume is more than a few hundred thousand records, you should use the batch data API for integrations.

    https://docs.microsoft.com/en-us/dynamics365/fin-ops-core/dev-itpro/data-entities/integration-overview

    In addition to that, did you set up the data entity with multiple threads? https://community.dynamics.com/365/financeandoperations/b/365foroperationstechnical/posts/dynamics-365-for-operations-integration-performance-tuning-tips
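    The package-based Data Management ("batch data") API described in the link above works in a few coarse-grained calls instead of one per record. A sketch of the flow, using the documented OData action names (the environment host is a placeholder):

    ```python
    FO_HOST = "https://yourenv.operations.dynamics.com"  # placeholder environment URL

    def dmf_action(action):
        """URL of a DataManagementDefinitionGroups OData action."""
        return (f"{FO_HOST}/data/DataManagementDefinitionGroups"
                f"/Microsoft.Dynamics.DataEntities.{action}")

    # 1. POST GetAzureWriteUrl -> returns a writable blob URL for the package zip.
    write_url_action = dmf_action("GetAzureWriteUrl")
    # 2. PUT the package zip to that blob URL (header: x-ms-blob-type: BlockBlob).
    # 3. POST ImportFromPackage with the package URL to import everything in one shot.
    import_action = dmf_action("ImportFromPackage")
    # 4. Poll for completion with the execution ID returned by the import call.
    status_action = dmf_action("GetExecutionSummaryStatus")
    print(import_action)
    ```

    So regardless of line count, the integration makes a handful of API calls; the heavy lifting happens inside D365FO's data management framework.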

  • MrT007

    @Will and @Kevin,

    Thanks for the prompt responses and feedback. I've been through the various links you provided, which were very informative.

    Just a bit more context on the current design.

    1.1. File generated on bespoke ERP.

    1.2. Data POSTed by invoking a Logic App RESTful API endpoint.

    1.3. At a high level, the Logic App does the following:

    1.3.1. Reads the file line by line (these are journal lines)

    1.3.2. Invokes a D365FO Endpoint per line to process the line into D365FO

    So, from my understanding of the links that were provided:

    2.1.  The "Recurring Integrations" capability and the "Data Management API" exist within the D365FO world (application suite). - @Will's Links

    2.2.  The "Import/Export Strategy - Batch API" and using a "Set based processing approach" also exist within the D365FO world (application suite) @Kevin's links.

    Hence, in the current design Logic Apps is merely acting as a "facade"/"proxy" for invoking the core D365FO endpoints. In theory, we could call those D365FO endpoints directly. From a performance perspective, these are some of my thoughts:

    3.1. We are essentially adding an extra hop to the process (i.e. going via Logic Apps)

    3.2. Given some of the journal volumes we're looking at (over 100k lines), it would be more efficient to pass the entire data set to D365FO and invoke some sort of batch-oriented utility (as per 2.1 and 2.2), instead of doing the reading (1.3.1) and invoking (1.3.2) of the D365FO endpoint in Logic Apps.
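    To put rough numbers on point 3.2: with one endpoint call per journal line, per-call overhead alone dominates the runtime. The 50 ms figure below is an assumed, illustrative value, not a measurement from this environment:

    ```python
    lines = 300_000                 # lower end of the daily volume in this thread
    per_call_overhead_s = 0.05      # assumed HTTP round-trip overhead per call

    # Line-by-line: one D365FO endpoint call per journal line.
    per_line_overhead_h = lines * per_call_overhead_s / 3600

    # Package-based: a handful of calls regardless of line count
    # (e.g. get upload URL, upload the package, trigger the import).
    package_calls = 3

    print(f"per-line: ~{per_line_overhead_h:.1f} h of call overhead; "
          f"package: {package_calls} calls total")
    ```

    Even under this optimistic assumption, the per-line pattern spends hours on call overhead alone, which is the motivation for the batch APIs in 2.1 and 2.2.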

    Once again, thanks for the advice. Much appreciated.

  • WillWU

    Hi MrT007,

    Glad to hear that. Please spare a little time to verify the helpful answer and close this thread.

  • Verified answer
    Martin Dráb, Most Valuable Professional

    Note that you can still use Logic Apps as the integration platform - it's often easier than building a direct connection. The app can then call a data management API to process the whole file, rather than processing records one by one.

  • Verified answer
    Rahul Mohta

    You could also use data management/entities and enable set-based processing.

    This would bring all the data into staging in D365FO at once.

  • MrT007

    Hi Will WU,

    Unfortunately I set up this inquiry as a discussion and not a question, which is probably why I can't "verify" this answer.

    Apologies.

    Regards

  • Martin Dráb, Most Valuable Professional

    Not a problem, moderators can help. I've changed the type to 'Question'.

  • MrT007

    Hi Martin,

    I 100% agree with your suggestion. I think using Logic Apps almost as a front/proxy to expose the underlying D365 Data Management API is a good option. And invoking an API from the Logic App that processes the entire file, rather than one record at a time, will definitely be more efficient.

    Thanks for sharing your thoughts. Much appreciated.

    Regards

  • MrT007

    Thanks Rahul. Based on my knowledge of "set-based" processing in the SQL world, I do think it would aid performance. I understand there may be challenges with "composite" entities, but our use case involves the basic "journal" entity.

    Regards
