Best Practices for Syncing SQL Change Tracking Data to Business Central

Posted on by 201
Hi Experts,
We are planning to build a synchronization service from SQL Server to Business Central (BC) using change tracking. The source system (SQL Server) will provide data in JSON format, and we need to push the changes into Business Central following best practices.
Below are the details and questions we are looking for guidance on:


Current Scenario

  • SQL Server has change tracking enabled.
  • Around 100 tables are involved.
  • Each table has a fixed set of fields that need to be synchronized with Business Central.
  • Different synchronization frequencies are required:
    • Some tables: instant / near real-time
    • Some tables: daily
    • Some tables: every 2 days
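For context, SQL Server change tracking is usually read incrementally with CHANGETABLE(CHANGES …, @last_version). A minimal sketch of building that query per table (the table and key column names here, dbo.Customer / CustomerNo, are hypothetical examples, not from the actual schema):

```python
# Sketch: build an incremental-read query for one change-tracked table.
# Table/column names (dbo.Customer, CustomerNo) are illustrative only.

def change_query(table: str, key_column: str, last_version: int) -> str:
    """Return a T-SQL query joining CHANGETABLE output to the base table,
    yielding inserts/updates/deletes since last_version. LEFT JOIN is used
    because deleted rows no longer exist in the base table."""
    return (
        f"SELECT ct.SYS_CHANGE_OPERATION, ct.SYS_CHANGE_VERSION, ct.{key_column}, t.* "
        f"FROM CHANGETABLE(CHANGES {table}, {last_version}) AS ct "
        f"LEFT JOIN {table} AS t ON t.{key_column} = ct.{key_column} "
        f"ORDER BY ct.SYS_CHANGE_VERSION"
    )

query = change_query("dbo.Customer", "CustomerNo", 42)
```

The service would run one such query per table on each sync tick, then translate the rows to JSON for BC.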
Questions: 

Design & Architecture

1. What is the recommended architecture to handle SQL change tracking and push changes into Business Central efficiently?
2. How should we manage different synchronization frequencies for the same set of tables?
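One way to handle question 2 is to tag each table with a sync interval and let a single scheduler decide, on every tick, which tables are due. A minimal sketch (the table names and intervals are made-up examples; real ones would come from configuration):

```python
from datetime import datetime, timedelta

# Hypothetical frequency tiers; in practice this mapping would be configuration.
SYNC_INTERVALS = {
    "Customer": timedelta(minutes=1),   # near real-time
    "SalesInvoice": timedelta(days=1),  # daily
    "ItemStats": timedelta(days=2),     # every 2 days
}

def tables_due(last_synced: dict, now: datetime) -> list:
    """Return the tables whose interval has elapsed since their last sync.
    Tables never synced (missing from last_synced) are always due."""
    return [
        table for table, interval in SYNC_INTERVALS.items()
        if now - last_synced.get(table, datetime.min) >= interval
    ]
```

This keeps one scheduler loop instead of one job per frequency, which scales better across ~100 tables.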

Queue-Based Processing

We are planning to use a queue-based approach.
Is this the right approach for handling large volumes and different sync intervals?
What are the best practices for retry handling, error logging, and scalability?
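On retry handling: a common pattern is exponential backoff with a capped attempt count, after which the message goes to a dead-letter store for later reprocessing. A library-agnostic sketch (queue products like Azure Service Bus provide this built in; this only illustrates the logic):

```python
import time

def process_with_retry(message, handler, max_attempts=5, base_delay=1.0,
                       dead_letter=None, sleep=time.sleep):
    """Call handler(message); on failure, retry with exponential backoff
    (1s, 2s, 4s, ...), then hand the message to dead_letter and re-raise
    after max_attempts."""
    for attempt in range(1, max_attempts + 1):
        try:
            return handler(message)
        except Exception as err:
            if attempt == max_attempts:
                if dead_letter:
                    dead_letter(message, err)
                raise
            sleep(base_delay * 2 ** (attempt - 1))
```

Every failure path should also log the message identity and error so the dead-letter backlog stays diagnosable.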

Implementation Options
Should this be implemented using:
  • Business Central Job Queue + AL code
  • An external service (Azure Functions, Logic Apps, Service Bus, etc.)
What is the recommended approach for a robust and scalable sync service?

Timing & Change Detection Logic

If a record is updated today, how should the sync timing be determined?
What is the best way to handle:
  • Incremental updates
  • Missed updates
  • Reprocessing failed records
Any suggestions, architecture diagrams, or real-world experiences would be highly appreciated.
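For incremental and missed updates, the usual approach is to persist a per-table watermark (the last synced change-tracking version) outside SQL Server, and to compare it on each run against CHANGE_TRACKING_MIN_VALID_VERSION() so that a cleanup of the retention window is detected rather than silently skipped. A minimal sketch of that decision:

```python
def sync_mode(last_synced_version, min_valid_version):
    """Decide how to sync from an externally stored watermark.

    min_valid_version is what SQL Server reports via
    CHANGE_TRACKING_MIN_VALID_VERSION(); if our watermark is older,
    changes were cleaned up and an incremental read would miss data.
    """
    if last_synced_version is None:
        return "initial_load"
    if last_synced_version < min_valid_version:
        return "full_reload"   # retention window exceeded; incremental unsafe
    return "incremental"
```

Only advance the watermark after BC has confirmed the batch, so a crash mid-batch replays rather than skips changes.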


Questions in my mind 

1. BC API throttling limits (600 req/5min) will bottleneck 100-table real-time sync
2. BC pessimistic locking causes sync failures when users edit records simultaneously
3. SQL Change Tracking auto-cleanup can lose unsynced changes if service is down >retention period
4. Network failures during batch operations create partial updates and data inconsistency
5. Managing 3 different sync frequencies (real-time/daily/2-day) creates dependency and ordering issues
6. BC validation rules not enforced in SQL cause repeated sync failures and queue backlog
7. No built-in monitoring means sync degradation goes unnoticed until users complain
8. Initial bulk data load (millions of records) hits throttling and takes days via API
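Concern 1 (the 600 requests per 5 minutes figure cited above) can be handled client-side with a sliding-window rate limiter, so the sync service throttles itself instead of hitting HTTP 429 responses. A minimal sketch of the window logic:

```python
from collections import deque

class SlidingWindowLimiter:
    """Allow at most `limit` calls per `window` seconds, e.g. the
    600-requests-per-5-minutes figure cited in the question."""
    def __init__(self, limit=600, window=300.0):
        self.limit, self.window = limit, window
        self.calls = deque()  # timestamps of recent calls

    def wait_time(self, now: float) -> float:
        """Seconds to wait before the next call is allowed (0 if allowed now)."""
        while self.calls and now - self.calls[0] >= self.window:
            self.calls.popleft()          # drop calls outside the window
        if len(self.calls) < self.limit:
            return 0.0
        return self.window - (now - self.calls[0])

    def record(self, now: float):
        self.calls.append(now)
```

Combined with batched $batch requests, this keeps throughput predictable under the quota.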
 
Suggested answer — OussamaSabbouh:
    Hello,
     
    Do NOT sync SQL → BC directly.
    Use an external, queue-based integration layer.
     
     
    ---
     
    Recommended setup
     
    SQL Change Tracking
    → Azure integration service (Functions/App Service)
    → Queues (Service Bus / Storage Queue)
    → Business Central APIs
     
    BC is a consumer only, not the orchestrator.
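Under that layout, each change can travel through the queue as a small self-describing message, so the BC-side consumer needs no knowledge of SQL. One possible envelope, sketched below; the field names are illustrative, not a BC or Service Bus schema:

```python
import json
from datetime import datetime, timezone

def make_envelope(table, operation, key, payload, change_version):
    """Build one queue message describing a single tracked change."""
    return json.dumps({
        "table": table,                  # source table name
        "operation": operation,          # I / U / D from SYS_CHANGE_OPERATION
        "key": key,                      # primary key value(s)
        "changeVersion": change_version, # for ordering and idempotency
        "payload": payload,              # field values to push to BC
        "enqueuedAt": datetime.now(timezone.utc).isoformat(),
    })
```

Keeping the change version in the message lets the consumer order and de-duplicate writes without calling back to SQL.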
     
     
    ---
     
    Key points (very condensed)
     
  • BC Job Queue: not suitable for 100 tables or near real-time
  • Queues: mandatory for scale, retries, ordering, recovery
  • Sync frequency: handled outside BC (separate queues/pipelines)
  • API throttling & locking: real issues → solved with batching + retries
  • Change tracking cleanup: store the last synced version externally
  • Failures: retry + dead-letter queue + idempotent writes
  • Validation errors: pre-validate before calling BC
  • Monitoring: Azure App Insights / logs, not BC
  • Initial load: use RapidStart / Configuration Packages, not APIs
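Idempotent writes can be approximated by deriving a deterministic key from the table, record key, and change version, and skipping any message whose key has already been applied (the "seen" store would be a durable table in practice; a set is used here only to sketch the logic):

```python
import hashlib

def idempotency_key(table: str, record_key: str, change_version: int) -> str:
    """Deterministic key: replays of the same change yield the same key."""
    raw = f"{table}|{record_key}|{change_version}"
    return hashlib.sha256(raw.encode()).hexdigest()

def apply_once(seen: set, msg: dict, apply) -> bool:
    """Apply msg only if its key is unseen; return True if applied."""
    key = idempotency_key(msg["table"], msg["key"], msg["changeVersion"])
    if key in seen:
        return False        # duplicate delivery; safe to drop
    apply(msg)
    seen.add(key)
    return True
```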
     
     
     
    ---
     
    Bottom line
     
    External Azure service + queues is the only scalable and reliable approach.
    Anything else will break under load.
     
     
    ---
     
    References
     
    https://learn.microsoft.com/en-us/dynamics365/business-central/dev-itpro/api-reference/v2.0/
     
    https://learn.microsoft.com/en-us/dynamics365/business-central/dev-itpro/webservices/use-odata-v4-web-services
     
    https://learn.microsoft.com/en-us/azure/service-bus-messaging/service-bus-messaging-overview
     
     
    Regards,
    Oussama Sabbouh
Suggested answer — YUN ZHU:
There's a major caveat: if your Business Central is online (SaaS), it cannot reach your on-premises SQL Server directly and needs a middleware layer, which can impact performance. I personally don't recommend this approach.
In addition to the API solutions you mentioned, you can also save SQL output files (such as update logs) in a location BC can read, like SharePoint or OneDrive, and have BC read the files outside working hours to update its records. This can work around the limitations of the APIs.
     
    Hope this can give you some hints.
    Thanks.
    ZHU
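ZHU's file-based alternative could look like the sketch below: change batches are written as timestamped JSON files to a drop folder (e.g. a folder synced to SharePoint/OneDrive, per the suggestion) for BC to pick up off-hours. The path and naming scheme are illustrative:

```python
import json
from datetime import datetime, timezone
from pathlib import Path

def export_batch(folder: Path, table: str, changes: list) -> Path:
    """Write one change batch as a timestamped JSON file for later pickup."""
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    path = folder / f"{table}_{stamp}.json"
    path.write_text(json.dumps({"table": table, "changes": changes}, indent=2))
    return path
```

The BC side would then import and delete (or archive) processed files, giving a natural retry unit per file.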
     
     
