Best Practices for Syncing SQL Change Tracking Data to Business Central

Hi Experts,
We are planning to build a synchronization service from SQL Server to Business Central (BC) using SQL change tracking. The source system (SQL Server) will provide the changed data in JSON format, and we need to push those changes into Business Central following best practices.
Below are the details and questions we are looking for guidance on:


Current Scenario

  • SQL Server has change tracking enabled.
  • Around 100 tables are involved.
  • Each table has a fixed set of fields that need to be synchronized with Business Central.
  • Different synchronization frequencies are required:
      • Some tables: instant / near real-time
      • Some tables: daily
      • Some tables: every 2 days
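
To make the starting point concrete, here is a rough sketch (Python with pyodbc) of how we picture reading incremental changes from one tracked table. The dbo.Customer table, its columns, and the connection string are placeholders, not our real schema:

```python
# Rough sketch: pull incremental changes for one tracked table.
# dbo.Customer / CustomerNo / Name and the connection string are placeholders.
import pyodbc

CONN_STR = ("Driver={ODBC Driver 18 for SQL Server};"
            "Server=<server>;Database=<db>;Trusted_Connection=yes;")

def read_changes(last_synced_version: int):
    """Return (current_version, changed rows as JSON-ready dicts)."""
    with pyodbc.connect(CONN_STR) as conn:
        cur = conn.cursor()
        # Capture the version BEFORE reading, so the next pull starts from here.
        current_version = cur.execute(
            "SELECT CHANGE_TRACKING_CURRENT_VERSION()").fetchval()
        rows = cur.execute(
            """
            SELECT ct.SYS_CHANGE_OPERATION, ct.CustomerNo, c.Name
            FROM CHANGETABLE(CHANGES dbo.Customer, ?) AS ct
            LEFT JOIN dbo.Customer AS c ON c.CustomerNo = ct.CustomerNo
            """,  # LEFT JOIN so deleted rows still appear (base columns NULL)
            last_synced_version).fetchall()
        changes = [{"op": op, "customerNo": no, "name": name}
                   for op, no, name in rows]
    return current_version, changes
```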
Questions: 

Design & Architecture

1. What is the recommended architecture to handle SQL change tracking and push changes into Business Central efficiently?
2. How should we manage the different synchronization frequencies across these tables? (A rough idea of what we mean is sketched below.)
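
For question 2, our tentative idea is a single scheduler driven by a per-table interval config rather than separate services per frequency. A minimal sketch (table names and intervals are made up):

```python
# Tentative idea: one scheduler, per-table intervals. Names/intervals made up.
from datetime import datetime, timedelta, timezone

SYNC_CONFIG = {
    "dbo.SalesOrder": timedelta(seconds=30),  # near real-time
    "dbo.Customer":   timedelta(days=1),      # daily
    "dbo.PriceList":  timedelta(days=2),      # every 2 days
}

def tables_due(last_run: dict) -> list:
    """Tables whose interval has elapsed; called on every scheduler tick."""
    now = datetime.now(timezone.utc)
    epoch = datetime.min.replace(tzinfo=timezone.utc)
    return [table for table, interval in SYNC_CONFIG.items()
            if now - last_run.get(table, epoch) >= interval]
```

Does a single config-driven scheduler like this hold up at 100 tables, or is a separate pipeline per frequency cleaner?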

Queue-Based Processing

We are planning to use a queue-based approach.
Is this the right approach for handling large volumes and different sync intervals?
What are the best practices for retry handling, error logging, and scalability?
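
For context, this is the kind of consumer loop we have in mind, assuming Azure Service Bus (azure-servicebus v7); push_to_bc is a hypothetical callable that writes one change to Business Central:

```python
# Sketch of a queue consumer with bounded retries and dead-lettering.
# Queue name and connection string are placeholders.
from azure.servicebus import ServiceBusClient

CONN_STR = "<service-bus-connection-string>"
MAX_DELIVERIES = 5  # after this, park the message for manual inspection

def process_queue(push_to_bc):
    with ServiceBusClient.from_connection_string(CONN_STR) as client:
        with client.get_queue_receiver(queue_name="bc-sync") as receiver:
            for msg in receiver:
                try:
                    push_to_bc(str(msg))  # may raise on HTTP/validation errors
                    receiver.complete_message(msg)
                except Exception as exc:
                    if msg.delivery_count >= MAX_DELIVERIES:
                        # Dead-letter keeps the payload for later reprocessing.
                        receiver.dead_letter_message(
                            msg, reason="sync-failed",
                            error_description=str(exc))
                    else:
                        receiver.abandon_message(msg)  # redelivered, count + 1
```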

Implementation Options
Should this be implemented using:
  • Business Central Job Queue + AL code
  • An external service (Azure Functions, Logic Apps, Service Bus, etc.)
What is the recommended approach for a robust and scalable sync service?
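
If the external route is recommended, we would call the standard Business Central API roughly like this (requests-based sketch; tenant, environment, and company IDs are placeholders, and real code would obtain the token via OAuth2 client credentials):

```python
# Sketch of one upsert via the standard BC API (v2.0 customers entity).
# BASE placeholders must be filled in; the token is acquired elsewhere.
import requests

BASE = ("https://api.businesscentral.dynamics.com/v2.0/"
        "<tenant-id>/<environment>/api/v2.0/companies(<company-id>)")

def upsert_customer(token: str, payload: dict) -> dict:
    headers = {"Authorization": f"Bearer {token}",
               "Content-Type": "application/json"}
    # Look the record up by its 'number' to decide insert vs. update.
    q = requests.get(
        f"{BASE}/customers?$filter=number eq '{payload['number']}'",
        headers=headers)
    q.raise_for_status()
    hits = q.json()["value"]
    if hits:
        # PATCH needs the record's etag in If-Match (optimistic concurrency).
        r = requests.patch(f"{BASE}/customers({hits[0]['id']})", json=payload,
                           headers={**headers,
                                    "If-Match": hits[0]["@odata.etag"]})
    else:
        r = requests.post(f"{BASE}/customers", json=payload, headers=headers)
    r.raise_for_status()
    return r.json()
```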

Timing & Change Detection Logic

If a record is updated today, how should the sync timing be determined? What is the best way to handle:
  • Incremental updates
  • Missed updates
  • Reprocessing failed records
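
Our working idea for missed updates is a per-table watermark (the last synced change tracking version), persisted only after BC confirms the batch, plus a guard against Change Tracking cleanup. A sketch, reusing the read pattern above:

```python
# Guard against Change Tracking cleanup outrunning our watermark.
def safe_version_for(cursor, table: str, last_synced: int):
    """Return last_synced if still usable, else None (full resync needed)."""
    min_valid = cursor.execute(
        "SELECT CHANGE_TRACKING_MIN_VALID_VERSION(OBJECT_ID(?))",
        table).fetchval()
    # If cleanup purged versions past our watermark, an incremental pull
    # would silently miss changes -> force a full reload of that table.
    if min_valid is None or last_synced < min_valid:
        return None
    return last_synced
```

Persisting the version only after a confirmed push also means failed records are naturally picked up again on the next pull.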
Any suggestions, architecture diagrams, or real-world experiences would be highly appreciated.


Questions in my mind 

1. BC API throttling limits (600 requests per 5 minutes) will bottleneck a 100-table real-time sync.
2. BC's pessimistic locking causes sync failures when users edit the same records simultaneously.
3. SQL Change Tracking auto-cleanup can lose unsynced changes if the service is down for longer than the retention period.
4. Network failures during batch operations create partial updates and data inconsistency.
5. Managing three different sync frequencies (real-time / daily / every 2 days) creates dependency and ordering issues.
6. BC validation rules that are not enforced in SQL cause repeated sync failures and queue backlog.
7. With no built-in monitoring, sync degradation goes unnoticed until users complain.
8. The initial bulk data load (millions of records) hits throttling and could take days via the API.
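
For concerns 1 and 8 specifically, the best we have come up with so far is honoring 429/Retry-After responses instead of failing the batch; a requests-based sketch:

```python
# Sketch: retry throttled calls, honoring Retry-After when the service
# sends it, otherwise backing off exponentially.
import time
import requests

def call_with_backoff(method: str, url: str, max_attempts: int = 6, **kwargs):
    """Retry on 429/503; raise if still throttled after max_attempts."""
    for attempt in range(max_attempts):
        resp = requests.request(method, url, **kwargs)
        if resp.status_code not in (429, 503):
            return resp
        wait = float(resp.headers.get("Retry-After", 2 ** attempt))
        time.sleep(wait)
    resp.raise_for_status()
    return resp
```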
 