Managing deployments via DevOps pipelines is a great way to implement your project's ALM. However, I faced an unusual issue: some Cloud Flows were automatically deactivated after deployment.
This could have been caused by a missing connection in the target environment, but that was not the case here: the Cloud Flows had been running successfully before the deployment.
When a new solution-aware Cloud Flow is created in the DEV environment and uses a connector, the platform prompts the user to create a connection for the selected connector. When doing so, a Connection Reference is created automatically.
The Connection Reference can be viewed as an intersection component sitting between Cloud Flows and Connectors.
The Connection stores the authentication credentials used to access the required data source, so the Connection component has an ownership aspect. In other words, a Connection is accessible by its creator but is not directly accessible by other system users. As a result, the Connection component cannot be deployed to other environments.
Cloud Flows might make use of premium connectors, which require a specific Power Automate license. This is not an impediment in DEV, where each developer has the correct licenses allocated; in the PROD environment, however, the Connections might use a single service account with a Power Automate license.
Let’s imagine a scenario where a Connection Reference for the Dataverse Connector is created in DEV and later deployed to PROD, with the Connection Reference and the related Cloud Flow being part of Solution A. When Solution A is deployed to PROD via DevOps ALM, the solution components will be owned by the Application User (Service Principal) used by the DevOps pipelines.
Consequently, after the initial deployment, the Connection Reference and the Cloud Flow will be owned by the DevOps Application User in the PROD environment.
The imported Connection Reference in PROD will still reference a Connection in DEV, via the Connection ID. Unfortunately, this automatically breaks the Cloud Flow execution on the first import. The platform offers options in the Solution explorer to relink the broken Connection References, but future solution deployments might break the same Connection References again.
Thankfully, Microsoft provides a simple mechanism that allows the DevOps pipeline to link the Connection References to a valid Connection ID in the target system. This is achieved during deployment via a deployment settings configuration.
Official Documentation: https://learn.microsoft.com/en-us/power-platform/alm/conn-ref-env-variables-build-tools
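As a sketch, the deployment settings file maps each Connection Reference logical name to a Connection ID that exists in the target environment. The logical name and GUID below are placeholders; the `ConnectorId` shown is the Dataverse connector's API path:

```json
{
  "EnvironmentVariables": [],
  "ConnectionReferences": [
    {
      "LogicalName": "contoso_DataverseConnRef",
      "ConnectionId": "00000000-0000-0000-0000-000000000000",
      "ConnectorId": "/providers/Microsoft.PowerApps/apis/shared_commondataserviceforapps"
    }
  ]
}
```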
In another post, I explained how to use the Deployment Settings with Power Platform Build Tools to update the Connection References via Release Pipelines.
https://cipdyn.wordpress.com/2023/03/13/d365-transform-connection-references-devops-alm-pipelines/
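In a YAML release pipeline, the Power Platform Build Tools import task can consume that settings file. This is a minimal sketch; the service connection name and file paths are placeholders for your own pipeline:

```yaml
- task: PowerPlatformImportSolution@2
  inputs:
    authenticationType: 'PowerPlatformSPN'
    PowerPlatformSPN: 'ProdServiceConnection'   # placeholder service connection
    SolutionInputFile: '$(Pipeline.Workspace)/drop/SolutionA_managed.zip'
    UseDeploymentSettingsFile: true
    DeploymentSettingsFile: '$(Pipeline.Workspace)/drop/deploymentSettings.json'
```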
However, a Connection Reference pointing to a valid Connection in PROD is not a complete fix, since the Connection Reference is updated by the DevOps Application User, which cannot access the Connections owned by the Cloud Flow Service Account (required for premium connectors). When this occurs, the Cloud Flow is imported, but the import process turns it off.
The solution to this problem is to share the Connection created by the Cloud Flow Service Account in PROD with the DevOps Application User.
The Connection can only be shared with Service Principals, not with normal users.
For this scenario, it is sufficient to grant the “Can Use” permission to the DevOps Application User.
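The sharing can also be scripted with the Power Apps administration PowerShell module. The sketch below is based on my understanding of the `Set-AdminPowerAppConnectionRoleAssignment` cmdlet; all GUIDs are placeholders, and the mapping of the “Can Use” permission to the `CanView` role name is an assumption to verify against the module's documentation:

```powershell
# Requires the Microsoft.PowerApps.Administration.PowerShell module.
# Placeholders: PROD environment ID, Connection ID, and the DevOps
# Application User's (Service Principal's) Entra object ID.
$params = @{
    EnvironmentName   = "00000000-0000-0000-0000-000000000000"
    ConnectorName     = "shared_commondataserviceforapps"
    ConnectionName    = "00000000-0000-0000-0000-000000000000"
    PrincipalObjectId = "00000000-0000-0000-0000-000000000000"
    RoleName          = "CanView"   # assumed API name behind "Can Use"
}
Set-AdminPowerAppConnectionRoleAssignment @params
```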
The Cloud Flows should now import safely through the DevOps pipelines.