
General Journals - Integrating 2.5 million records per day


Hi guys

Disclaimer: I'm not a Business Central developer or consultant; I'm an integration developer working on connecting an external system to BC. Everything I know about BC's API surface comes from online research, so I may well be missing something obvious to those of you in the BC ecosystem. Please bear with me, and I'd really appreciate being pointed in the right direction.

Scenario

We're integrating daily transaction batches from a core banking system into Business Central Online.  The volumes are significant:

  • Test environment: ~400,000 records per day
  • Projected production: ~2,500,000 records per day

Current Approach

Based on what I could find in Microsoft's documentation, our workflow uses the standard BC API:

  1. Create a General Journal Batch
  2. Insert General Journal Lines via the $batch endpoint (max 100 records per call)
  3. Post the batch
  4. Delete the batch
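For reference, step 2 builds an OData JSON `$batch` payload. The sketch below shows roughly what we send per call; the URL shape and field names are my reading of Microsoft's v2.0 API docs, and the company/journal IDs are placeholders, so verify against your own tenant's `$metadata`:

```python
# Sketch of one JSON $batch body for inserting journal lines.
# Assumption: the nested companies(...)/journals(...)/journalLines URL
# matches your environment; adjust to whatever $metadata exposes.

MAX_PER_BATCH = 100  # BC rejects batches larger than this

def build_batch_body(company_id, journal_id, lines):
    """Build the JSON body for one POST to .../api/v2.0/$batch."""
    if len(lines) > MAX_PER_BATCH:
        raise ValueError("chunk the input to 100 lines per call first")
    return {
        "requests": [
            {
                "method": "POST",
                "id": f"r{i}",  # unique per sub-request within the batch
                "url": f"companies({company_id})/journals({journal_id})/journalLines",
                "headers": {"Content-Type": "application/json"},
                "body": line,
            }
            for i, line in enumerate(lines, start=1)
        ]
    }
```

Each sub-request gets its own `id` so failures can be matched back to individual lines in the batch response.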
Performance Observed
  • 8,900 records in 10 minutes
  • Extrapolating linearly (assuming no timeouts or failures): roughly 7.5 hours for 400K records and roughly 47 hours for 2.5M records, which is obviously not viable for a daily process.
  • We attempted parallelizing (5 concurrent requests of 100 records each) but encountered key conflict errors.
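To make the extrapolation concrete, and to sketch one workaround we are considering for the key conflicts: giving each concurrent worker its own journal batch so that no two workers insert into the same batch. This is an assumption about the cause of the conflicts, not something I've verified against BC internals, and the batch names below are purely illustrative:

```python
import math

# Back-of-envelope throughput from the observed numbers:
RATE_PER_MIN = 8900 / 10                     # ~890 records/minute
HOURS_400K = 400_000 / RATE_PER_MIN / 60     # ~7.5 hours
HOURS_2_5M = 2_500_000 / RATE_PER_MIN / 60   # ~47 hours

def assign_to_batches(lines, workers, chunk=100):
    """Partition lines so each worker posts into its own journal batch.

    Returns {batch_name: [chunk, chunk, ...]}. One journal batch per
    worker avoids concurrent inserts into the same batch, which is one
    plausible source of the key-conflict errors (unverified assumption).
    """
    per_worker = math.ceil(len(lines) / workers)
    out = {}
    for w in range(workers):
        slice_ = lines[w * per_worker:(w + 1) * per_worker]
        if not slice_:
            continue
        # Hypothetical naming scheme: DAILY00, DAILY01, ...
        out[f"DAILY{w:02d}"] = [
            slice_[i:i + chunk] for i in range(0, len(slice_), chunk)
        ]
    return out
```

Even if this removes the conflicts, note it only multiplies throughput by the worker count, so it shrinks but does not close the gap to 2.5M records per day.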
What We've Explored
  • From my research, the $batch endpoint appears to be the fastest standard method for bulk ingestion, but again, I'm not a BC expert and may simply not know what I don't know.
  • I've read references to custom web services, staging tables with job queues, and polling patterns, but I'm not in a position to evaluate or recommend these without hands-on implementation experience in BC.
Questions
  1. Are there faster standard endpoints or built-in BC functionality for bulk data ingestion that I'm not aware of?  If so, I'd be grateful for a point in the right direction, documentation links, feature names, anything to get me started.
  2. If the $batch endpoint is indeed the ceiling for standard API throughput, what customization approach would you recommend for ingesting this volume?  Staging tables + job queues?  Custom web services?  Something else entirely?
  3. Has anyone successfully handled similar daily volumes (hundreds of thousands to millions of journal lines) in BC Online?  What architecture did you land on, and what pitfalls should we watch out for?

I just want to be as sure as possible that we've exhausted all options for increasing performance through BC's standard API surface before advising the client to go down the customization path. We also logged a support call with Microsoft but were advised to post here for community guidance.

Any help is hugely appreciated. Thanks in advance.
