Hello Dynamics Community,
I am exploring the best approach to replicate data from Dynamics 365 Finance & Operations (D365FO) into Google Cloud Platform (GCP). My goal is to achieve near real-time data replication with a streaming architecture to facilitate analytics and other use cases in GCP.
Here’s my current scenario and the steps I’ve considered:
Current Scenario:
- Source: D365FO as the primary data source.
- Intermediate Storage: Potentially leveraging Azure Synapse Link, Azure Data Lake, or SQL Server for staging the data.
- Destination: BigQuery in GCP as the primary target for analytics and reporting.
- Integration Tools: Exploring solutions like Azure Data Factory (ADF), Debezium for CDC, or the Dynamics APIs.
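To give a concrete picture of the API-based option I'm weighing, here is a minimal sketch of how I imagine building incremental pulls against the D365FO OData endpoint. The environment URL, the `CustomersV3` entity, and the `ModifiedDateTime` field are placeholders for illustration; the actual change-tracking column varies per data entity.

```python
from urllib.parse import quote

def build_incremental_odata_url(base_url: str, entity: str, watermark_utc: str) -> str:
    """Build an OData query that pulls only rows changed since the last sync.

    NOTE: 'ModifiedDateTime' is a hypothetical field name; the real
    change-tracking column depends on the D365FO data entity.
    """
    filt = quote(f"ModifiedDateTime gt {watermark_utc}")
    order = quote("ModifiedDateTime asc")
    return f"{base_url}/data/{entity}?$filter={filt}&$orderby={order}"

# Example: next incremental pull since the stored watermark
url = build_incremental_odata_url(
    "https://myorg.operations.dynamics.com",  # hypothetical environment URL
    "CustomersV3",
    "2024-01-01T00:00:00Z",
)
```

The idea would be to persist the watermark after each successful load and reuse it as the lower bound of the next `$filter`, which is why I'm asking below whether this pattern holds up at scale.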
Challenges:
- Unlike Dynamics 365 Customer Engagement, D365FO does not appear to offer a native, direct connector to GCP.
- Determining the best architecture to minimize latency and ensure scalability.
- Choosing the right method for Change Data Capture (CDC) or incremental updates from D365FO.
- I would like to be able to select which tables and stored procedure outputs are run for the integration.
Questions:
- Has anyone successfully implemented a similar integration between D365FO and GCP? If so, what tools or architecture did you use?
- What would be the most efficient way to extract and transform data from D365FO for real-time or near real-time replication to GCP?
- Are there specific third-party tools or middleware you recommend for this use case (e.g., to manage CDC)?
- Are there best practices for using Azure as a staging area before pushing data to GCP?
- Would leveraging D365FO APIs for this purpose be feasible at scale, or should I explore database-level replication options?
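To make the BigQuery side of the question concrete: the staging transform I have in mind would flatten each OData response page into newline-delimited JSON, since that is a format BigQuery load jobs accept. This is only a sketch; the entity and field names are invented for illustration.

```python
import json

def odata_page_to_ndjson(page: dict) -> str:
    """Convert one OData response page ({'value': [...]}) to NDJSON.

    Each source row becomes one JSON object per line, and OData
    metadata keys (those starting with '@odata.') are dropped.
    """
    lines = []
    for row in page.get("value", []):
        clean = {k: v for k, v in row.items() if not k.startswith("@odata.")}
        lines.append(json.dumps(clean, sort_keys=True))
    return "\n".join(lines)

# Hypothetical page from a D365FO data entity
page = {
    "@odata.context": "https://myorg.operations.dynamics.com/data/$metadata#CustomersV3",
    "value": [
        {"CustomerAccount": "C001", "Name": "Contoso", "@odata.etag": 'W/"x"'},
        {"CustomerAccount": "C002", "Name": "Fabrikam", "@odata.etag": 'W/"y"'},
    ],
}
ndjson = odata_page_to_ndjson(page)
```

Whether this shaping step belongs in Azure (before landing in GCS) or inside GCP is exactly the staging-area question above.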
I would greatly appreciate any guidance, suggestions, or resources that can help me design an optimal solution.
Thank you in advance for sharing your expertise!