Azure Data Factory is a cloud-based data integration service that allows you to create data-driven workflows in the cloud for orchestrating and automating data movement and transformation. Using Azure Data Factory, you can create and schedule data-driven workflows (called pipelines) that can ingest data from disparate data stores.
A pipeline is a logical grouping of activities that together perform a task. The pipeline allows you to manage the activities as a set instead of each one individually. For example, you can deploy and schedule the pipeline, instead of the activities independently.
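To make that idea concrete, here is a minimal sketch of how a pipeline with a single Copy activity, like the one we will build in this post, could be defined programmatically with the azure-mgmt-datafactory Python SDK. The subscription, resource group, factory, dataset, and pipeline names are placeholders, the referenced datasets are assumed to already exist, and the exact model constructors can vary slightly between SDK versions.

```python
# Minimal sketch: a pipeline is just a named set of activities managed as one unit.
# All names below are placeholders; the two datasets are assumed to exist already.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    CopyActivity, DatasetReference, DelimitedTextSink, DynamicsAXSource, PipelineResource,
)

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

# One Copy activity: read from a Dynamics AX (OData) dataset, write CSV files to blob storage.
copy_activity = CopyActivity(
    name="CopyCustomersToBlob",
    inputs=[DatasetReference(type="DatasetReference", reference_name="DynamicsAxCustomers")],
    outputs=[DatasetReference(type="DatasetReference", reference_name="BlobCustomersCsv")],
    source=DynamicsAXSource(),
    sink=DelimitedTextSink(),
)

# The pipeline groups the activities so they can be deployed and scheduled as a set.
pipeline = PipelineResource(activities=[copy_activity])
adf_client.pipelines.create_or_update("<resource-group>", "<data-factory-name>", "CopyD365FinOpsData", pipeline)
```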
This time I want to share with you the step-by-step process for using the Copy activity in Azure Data Factory to copy data from Dynamics 365 Finance and Supply Chain using the Dynamics AX connector.
Note: To use this connector to copy data from Dynamics 365 Finance and Operations, refer to the Dynamics 365 documentation on OData support and the authentication method. Keep in mind that OData is not recommended for large data sets or frequent data load scenarios because of its performance limitations.
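To give an idea of what the Dynamics AX connector does behind the scenes, the sketch below authenticates with an Azure AD service principal (client credentials) and issues an OData request against the Finance and Operations /data endpoint, which is essentially the pattern the connector relies on. The environment URL, app registration values, entity set name (CustomersV3 is the public collection name for the CustomerV3 entity), and selected fields are illustrative assumptions only.

```python
# Sketch of the OData + AAD pattern the Dynamics AX connector relies on.
# All values below are placeholders for illustration.
import requests

tenant_id = "<tenant-id>"
client_id = "<app-registration-client-id>"
client_secret = "<app-registration-secret>"
environment_url = "https://<your-environment>.operations.dynamics.com"

# 1. Acquire a token for the F&O environment with the client credentials flow.
token_response = requests.post(
    f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token",
    data={
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": f"{environment_url}/.default",
    },
)
access_token = token_response.json()["access_token"]

# 2. Query the OData entity set. Data comes back page by page, which is one reason
#    OData is slow for very large data sets and frequent loads.
response = requests.get(
    f"{environment_url}/data/CustomersV3?$top=10&$select=CustomerAccount,OrganizationName",
    headers={"Authorization": f"Bearer {access_token}"},
)
for customer in response.json()["value"]:
    print(customer["CustomerAccount"], customer["OrganizationName"])
```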
1. Log in to Azure Portal.
2. Search for Data factories
3. Create a new Data Factory instance
4. Once the deployment is successful, click on Go to resource
5. Inside the data factory click on Open Azure Data Factory Studio
6. Create a new Pipeline
7. In the Activities pane, expand Move & transform and drag the Copy data activity onto the pipeline canvas
8. Go to the Source tab, click + New, and select the Dynamics AX connector. This connector is based on OData calls; see Open Data Protocol (OData) - Finance & Operations | Dynamics 365 | Microsoft Docs
9. Create a new linked service to specify the connection properties.
10. Specify the details to connect to the D365 environment and test the connection. For more information, review Service endpoints overview - Finance & Operations | Dynamics 365 | Microsoft Docs (Register a web application with AAD)
11. In Set properties, for this example we will select the CustomerV3 data entity
12. Click on Preview data to see the data from the data entity
13. In the Sink tab, define a new dataset, which for this example will be Azure Blob Storage
14. In Select format, for this example we will select the CSV format
15. In Set properties, enter a name and select + New under Linked service
16. In New linked service, specify the details to connect to the Azure Blob Storage account and test the connection.
17. Create a file path in the Azure Blob Storage
18. Click Publish all to publish the changes to the data factory
19. Click Debug to trigger a debug run of the pipeline (a scripted equivalent of steps 19-21 is sketched after this list)
20. Click the glasses icon to see the details of the copy run
21. Check the files in the Azure Blob Storage container
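For reference, here is a sketch of a scripted equivalent of steps 19-21: trigger a run of the published pipeline, poll its status, and list the files that landed in the blob container. The subscription, resource group, factory, pipeline, storage account, container, and folder names are all placeholders.

```python
# Sketch: trigger a pipeline run, wait for it to finish, then list the output blobs.
# All resource names are placeholders.
import time
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.storage.blob import BlobServiceClient

credential = DefaultAzureCredential()
adf_client = DataFactoryManagementClient(credential, "<subscription-id>")

# Trigger the published pipeline (the portal's Debug button runs the unpublished draft instead).
run = adf_client.pipelines.create_run("<resource-group>", "<data-factory-name>", "CopyD365FinOpsData")

# Poll the run status until it finishes.
while True:
    pipeline_run = adf_client.pipeline_runs.get("<resource-group>", "<data-factory-name>", run.run_id)
    if pipeline_run.status not in ("Queued", "InProgress"):
        break
    time.sleep(15)
print("Pipeline run finished with status:", pipeline_run.status)

# List the CSV files the Copy activity wrote to the blob container.
blob_service = BlobServiceClient("https://<storage-account>.blob.core.windows.net", credential=credential)
container = blob_service.get_container_client("<container-name>")
for blob in container.list_blobs(name_starts_with="<folder-path>/"):
    print(blob.name, blob.size)
```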
Thanks for reading,
Said
Acknowledgments: I want to thank my colleagues Samuel Ardila (Senior Customer Engineer) and Amy Flowers (Customer Engineer) for their contributions and peer review.