Recently a client tasked me with developing a solution that ran one of their Azure Data Factory pipelines through an event-based trigger rather than a time-based one. While Azure Data Factory does support event-based triggers, they are limited to storage events, such as automatically running a pipeline when a file is created, modified, or deleted in Azure Blob Storage.
In the past, if you wanted to run an Azure Data Factory pipeline when a record was created or updated in a system like Dataverse, that meant calling Azure Data Factory through the REST API or an Azure SDK, which can be difficult and time-consuming for non-developers. Luckily, there is a way to run Azure Data Factory pipelines directly from a Microsoft Power Automate flow without any developer experience, and it can be set up very quickly.
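For context, here is a minimal sketch of that developer route using the Azure SDK for Python (the azure-identity and azure-mgmt-datafactory packages); the subscription, resource group, factory, pipeline, and parameter names below are all placeholders.

```python
# A sketch of triggering an ADF pipeline run in code, the route this
# article's no-code approach replaces. All names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

credential = DefaultAzureCredential()
client = DataFactoryManagementClient(credential, "<subscription-id>")

# Kick off a pipeline run, optionally passing pipeline parameters.
run = client.pipelines.create_run(
    resource_group_name="my-resource-group",   # placeholder
    factory_name="my-data-factory",            # placeholder
    pipeline_name="my-pipeline",               # placeholder
    parameters={"triggeredBy": "manual"},      # hypothetical parameter
)
print(f"Started pipeline run: {run.run_id}")
```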
Power Automate No-Code Solution
To get started with the Azure Data Factory connector, you only need a few pieces of information: the Subscription, Resource Group, Data Factory Name, and Data Factory Pipeline Name. If you are planning to build this process in Power Automate, you can find all of these values in the Azure Portal.
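If it helps to see where those four values end up, this is roughly the publicly documented createRun REST endpoint that sits behind a pipeline run; a sketch with placeholder values only.

```python
# How the four values from the Azure Portal map onto the Data Factory
# createRun REST endpoint (placeholders throughout).
subscription_id = "<subscription-id>"
resource_group = "<resource-group>"
factory_name = "<data-factory-name>"
pipeline_name = "<pipeline-name>"

url = (
    f"https://management.azure.com/subscriptions/{subscription_id}"
    f"/resourceGroups/{resource_group}"
    f"/providers/Microsoft.DataFactory/factories/{factory_name}"
    f"/pipelines/{pipeline_name}/createRun?api-version=2018-06-01"
)
print(url)
```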
Below I outline the steps to easily set this up and get it running. All you need is a Dataverse trigger that fires when a row is added, modified, or deleted. In this example, we will use a change type of 'Modified' and run the Azure Data Factory pipeline when a Dataverse field is updated to a null value. After your Dataverse trigger step, choose the Azure Data Factory connector and its 'Create a pipeline run' action.
Pipeline Parameters
For the required fields on the Azure Data Factory connector, you can hardcode the values, but I prefer to use environment variables if you plan on deploying these changes up to your Sandbox and Production environments; otherwise, you will have to manually update those values in each environment. I also like to use pipeline parameters, which allow you to pass values into the pipeline run, as sketched below.
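As a sketch of what pipeline parameters look like, the connector's Parameters field accepts a JSON object mapping parameter names to values. The parameter names and the Dataverse dynamic-content expression below are hypothetical; in the flow designer you would insert dynamic content from the Dataverse trigger instead of typing a literal string.

```python
import json

# A hypothetical parameters object for the 'Create a pipeline run' action:
# each key must match a parameter declared on the ADF pipeline, and each
# value can be static or pulled from the Dataverse trigger's outputs.
pipeline_parameters = {
    "modifiedRecordId": "@{triggerOutputs()?['body/accountid']}",  # hypothetical dynamic content
    "changeType": "Modified",
}
print(json.dumps(pipeline_parameters, indent=2))
```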
Once the Power Automate flow is built and all connection and subscription information is validated, you can test the flow to see if it works. As a reminder, once the flow succeeds, you can check your pipeline's progress in Azure Data Factory under Triggered pipeline runs in the Monitor hub.
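If you prefer checking from code rather than the Monitor UI, here is a minimal sketch (same placeholder names as the earlier SDK example) that looks up a run's status with the Python SDK.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

credential = DefaultAzureCredential()
client = DataFactoryManagementClient(credential, "<subscription-id>")

# Look up the status of a specific run by its run ID (placeholders throughout).
run = client.pipeline_runs.get(
    resource_group_name="my-resource-group",
    factory_name="my-data-factory",
    run_id="<run-id-from-create-run>",
)
print(run.status)  # e.g. "InProgress", "Succeeded", "Failed"
```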
In conclusion, connecting Power Automate flows to Azure Data Factory opens the door to smarter, event-driven integrations without heavy coding. It is a simple, scalable way to bridge the gap between Microsoft Dynamics 365 and your Azure Data Factory pipelines. With just a few clicks, you can automate complex data movements and make your integrations more responsive to real business events.
Mike Mitchell – Senior Consultant
Working with New Dynamic
New Dynamic is a Microsoft Solutions Partner focused on Dynamics 365 Customer Engagement and the Power Platform. Our team of dedicated professionals strives to provide first-class experiences incorporating integrity, teamwork, and a relentless commitment to our clients' success.
Contact Us today to transform your sales productivity and customer buying experiences.