Hello, I am trying to find the best architecture to ingest data from both Business Central (BC) and Dataverse into Microsoft Fabric. Since we don't have much experience with Python and only have a few transformations to perform, I would like to avoid using notebooks.
Currently, I am considering two options:
- Ingesting data with Dataflow Gen2 – the issue here is that we would need to manage incremental refresh ourselves, especially when records are deleted from one of the sources (either BC or Dataverse).
- Using the bc2adls tool together with Azure Synapse Link – this would land the data in Azure Data Lake Storage, and we would then expose it in Fabric via shortcuts (if possible).
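To illustrate the deletion problem in the first option: because incremental refresh only picks up new or changed rows, detecting deletes amounts to comparing the full set of source keys against what already landed in Fabric and removing the difference, which is the kind of logic that usually pushes people toward notebooks. A minimal sketch of that key comparison (all table and key names here are hypothetical, not from either source):

```python
# Hypothetical sketch of the delete-detection step an incremental load
# would need: diff the primary keys currently in the source (BC/Dataverse)
# against the keys already landed in Fabric, and flag the rows that
# disappeared upstream so they can be removed from the destination.

def find_deleted_keys(source_keys, destination_keys):
    """Return keys present in the destination but no longer in the source."""
    return set(destination_keys) - set(source_keys)

# Example: two records were deleted upstream since the last load.
source = ["C001", "C003"]                        # current keys in the source
destination = ["C001", "C002", "C003", "C004"]   # keys already in Fabric
print(sorted(find_deleted_keys(source, destination)))
```

This still requires reading the full key set from the source on every run, which is part of why we are weighing it against the bc2adls/Synapse Link route.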
Which of the two approaches is better in terms of cost and performance, and are there other approaches to consider?
