My concern is about importing large XML files, which amount to approximately 50k records per day. We have a middleware that can do the transformation, splitting, etc., but how much can the batch API handle? Is it even possible to pass such a large amount of XML data through the POST call? Or would there be a wiser solution?
Where do you want to import that data? I have seen imports of more than 50k journal lines in XML format through the batch API without any issue.
Hi, the plan is to import journal lines. Does the data need to be split across many sessions? I'm trying to understand how I should guide the developers behind the middleware system.
The ledger journal entity uses set-based operations, so it's very fast and you don't need to split anything. Just export the XML format so that the other system can generate the exact same format and pass it to the endpoint.
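To make "pass it to the endpoint" concrete, here is a minimal Python sketch of pushing an XML file to the recurring integrations enqueue endpoint of a Finance and Operations environment. The environment URL, activity ID, entity name, and token acquisition are placeholders/assumptions you must replace with your own values; this is a sketch, not a definitive implementation.

```python
# Minimal sketch: enqueue one XML file for a recurring data job.
# All identifiers below are placeholders (assumptions), not real values.
import requests

D365_URL = "https://yourenvironment.operations.dynamics.com"   # placeholder
ACTIVITY_ID = "00000000-0000-0000-0000-000000000000"           # placeholder: recurring job activity ID
TOKEN = "eyJ..."  # acquire via Azure AD, e.g. an MSAL client-credentials flow

def enqueue_xml(xml_payload: bytes, entity_name: str) -> str:
    """POST one XML payload to the enqueue endpoint; returns the message ID text."""
    resp = requests.post(
        f"{D365_URL}/api/connector/enqueue/{ACTIVITY_ID}",
        params={"entity": entity_name},
        headers={
            "Authorization": f"Bearer {TOKEN}",
            "Content-Type": "application/xml",
        },
        data=xml_payload,
        timeout=300,
    )
    resp.raise_for_status()
    # The service responds with a message ID that can be used for status tracking.
    return resp.text

if __name__ == "__main__":
    with open("journal_lines.xml", "rb") as f:
        message_id = enqueue_xml(f.read(), "General journal")  # entity name is an assumption
    print("Enqueued, message ID:", message_id)
```

The middleware can call this once per generated file; keeping the message IDs lets you poll the job status afterwards.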
If the target journal entity supports set-based operations, you shouldn't have any issues. However, don't try to post 50,000 lines in one journal; instead, split them into smaller journals (for example, 500-1,000 lines per journal). A rough splitting sketch follows after the link below.
If your entity doesn't support set-based operations, you might want to think about optimizing the integration.
I wrote a blog post about the topic some years ago: community.dynamics.com/.../dynamics-365-for-operations-integration-performance-tuning-tips
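As a rough illustration of the splitting advice above, here is a minimal Python sketch of how middleware could partition the incoming lines into separate journals before serializing them to XML. The chunk size, the journal-number scheme, and the JournalBatchNumber field name are illustrative assumptions; match them to the XML your own entity export actually produces.

```python
# Hedged sketch: cut ~50k source lines into journals of at most 1,000 lines
# each, giving every chunk its own journal number so each posts separately.
from itertools import islice
from typing import Iterable, Iterator

CHUNK_SIZE = 1000  # 500-1,000 lines per journal, per the advice above

def chunked(lines: Iterable[dict], size: int = CHUNK_SIZE) -> Iterator[list]:
    """Yield successive lists of at most `size` journal lines."""
    it = iter(lines)
    while chunk := list(islice(it, size)):
        yield chunk

def assign_journals(lines: Iterable[dict]) -> Iterator[tuple[str, list]]:
    """Stamp each chunk with its own journal number (hypothetical scheme)."""
    for n, chunk in enumerate(chunked(lines), start=1):
        journal_number = f"MID-{n:05d}"  # assumption: middleware-side numbering
        for line in chunk:
            line["JournalBatchNumber"] = journal_number  # assumed field name
        yield journal_number, chunk

if __name__ == "__main__":
    sample = [{"Account": "110110", "Amount": 1.0} for _ in range(2500)]
    for journal_number, chunk in assign_journals(sample):
        print(journal_number, len(chunk))  # MID-00001 1000, MID-00002 1000, MID-00003 500
```

Each yielded chunk would then be serialized to the entity's XML format and sent to the endpoint as its own journal.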