@Will and @Kevin,
Thanks for the prompt responses and feedback. I've been through the various links you provided, which were very informative.
Just a bit more context on the current design:
1.1. A file is generated on a bespoke ERP system.
1.2. The data is POSTed by invoking a Logic App RESTful API endpoint.
1.3. At a high level, the Logic App does the following:
1.3.1. Reads the file line by line (these are journal lines)
1.3.2. Invokes a D365FO endpoint per line to process that line into D365FO (see the sketch after this list)
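Just to make the per-line pattern concrete, it essentially boils down to something like the following sketch (Python with requests; the environment URL, the pre-acquired bearer token, the "LedgerJournalLines" entity name, and the CSV format are all illustrative assumptions, not our actual config):

```python
import csv
import requests

# Illustrative assumptions: environment URL, a pre-acquired AAD bearer token,
# and "LedgerJournalLines" as the target data entity name.
D365FO_URL = "https://yourenv.operations.dynamics.com"
HEADERS = {"Authorization": "Bearer <aad-token>"}

def post_journal_line(line: dict) -> None:
    """One HTTP round trip per journal line -- the current per-line pattern."""
    resp = requests.post(
        f"{D365FO_URL}/data/LedgerJournalLines",
        json=line,
        headers=HEADERS,
        timeout=30,
    )
    resp.raise_for_status()

# 100K lines => 100K sequential round trips, each paying HTTP and commit overhead.
with open("journal.csv", newline="") as f:
    for row in csv.DictReader(f):  # assumes a headed CSV; format is illustrative
        post_journal_line(row)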
So, from my understanding of the links that were provided:
2.1. The "Recurring Integrations" capability and the "Data Management API" exist within the D365FO world (application suite) - @Will's links.
2.2. The "Import/Export Strategy - Batch API" and the "set-based processing approach" also exist within the D365FO world (application suite) - @Kevin's links. (A sketch of the package-based flow follows this list.)
Hence, Logic Apps in the current design is merely acting as a "facade"/"proxy" for invoking the core D365FO endpoints. In theory, we could call those D365FO endpoints directly. From a performance perspective, here are some of my thoughts:
3.1. We are essentially adding an extra hop to the process (i.e. going via Logic Apps).
3.2. Given some of the journal volumes we're looking at (over 100K lines), it would be more efficient to pass the entire data set to D365FO and invoke some sort of batch-oriented utility (as per 2.1 and 2.2), instead of doing the reading (1.3.1) and per-line invoking (1.3.2) of the D365FO endpoint in Logic Apps (see the sketch after this list).
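On 3.2 specifically, the Recurring Integrations enqueue endpoint (from 2.1) appears to accept the whole file in a single POST, leaving the per-line processing to D365FO's batch framework. A hedged sketch; the activity ID and entity name are placeholders for whatever the recurring data job would actually be configured with:

```python
import requests

D365FO_URL = "https://yourenv.operations.dynamics.com"
ACTIVITY_ID = "<recurring-data-job-activity-id>"  # assumption: data job set up in D365FO
HEADERS = {"Authorization": "Bearer <aad-token>"}

# One POST carries the entire journal file; D365FO's batch framework then
# processes it server-side, instead of Logic Apps looping per line.
with open("journal.csv", "rb") as f:
    resp = requests.post(
        f"{D365FO_URL}/api/connector/enqueue/{ACTIVITY_ID}",
        params={"entity": "GeneralJournal"},  # assumed entity name
        data=f,
        headers=HEADERS,
        timeout=300,
    )
resp.raise_for_status()
message_id = resp.text  # message id, usable for subsequent status checks
```

If I've read the docs correctly, that single request replaces the 100K+ endpoint invocations in the current design, which is where I'd expect most of the performance gain to come from.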
Once again, thanks for the advice. Much appreciated.