Utilizing Power Automate to Query Dataverse Data Outside of the Current Environment
The Power Platform offers hundreds of built-in connectors to retrieve data from internal sources like SharePoint, Dataverse, and Office 365, as well as external sources like DocuSign, Twitter, and WordPress. Those connectors can achieve just about anything you need out of the box, but every once in a while you'll run into an issue that requires a custom solution. For instance, whenever you are logged into Power Automate, you can use the Dataverse connector to query data in the current environment. That is adequate for single-environment data systems, but what if your company needs multiple environments, whether to separate organizational units or to dedicate each environment to a different purpose? Power Automate has legacy connectors that achieve this task, but they are not available to all customers. There is also a new connector coming soon to address this for those interested (see more here).
You could use third-party API calls to achieve this task, or you could stay within the platform and achieve it using an HTTP action! This use case is worth exploring because familiarizing yourself with REST API calls within the platform dramatically extends your data reach. Most data sources offer extensive API documentation to help developers extend their platforms. Accessing those APIs through Power Automate requires some knowledge of how to make the calls, but once you understand it, you can improve migrations from external sources and drive broader adoption of the Power Platform. I was unable to find a great end-to-end example of accessing external Dataverse data (though many fabulous content creators have tackled aspects of it), so I wanted to catalog the process in this post. It's a long read, but the goal is to show how you can use Power Automate to make your data, regardless of where it lives, available to the entire Power Platform for use in your apps, products, and flows.
Our Task
We want to see which customer accounts exist in both the current environment and a second environment. We can find these by looking through the tables in each environment:
Using the Dataverse connector in Power Automate, you can find all of the accounts in your current environment. Here, I append each account to an array variable:
Without using a legacy connector for Dataverse to query the second environment, that's as far as we can go on this task.
Accessing Our Data
Microsoft provides extensive documentation on how to access Dataverse data using REST API calls, with examples here. When creating data queries, you can choose which columns you'd like returned, how the results are ordered, how much data is returned, and many other options to specify exactly what you need. So that's great! Let's get our data using the following HTTP call:
https://{your-organization-url}/api/data/v9.2/accounts?$select=name
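Those OData query options can be combined in one URL. As a sketch (the organization URL below is a hypothetical placeholder, and the extra `$orderby`/`$top` options are just illustrations of the choices mentioned above):

```python
# Hypothetical organization URL -- replace with your environment's CRM URL.
ORG_URL = "https://contoso.crm.dynamics.com"

# OData query options shape exactly what comes back:
options = {
    "$select": "name",       # columns to return
    "$orderby": "name asc",  # sort order
    "$top": "100",           # maximum number of rows
}

# Join the options into a query string, percent-encoding spaces.
query = "&".join(f"{k}={v.replace(' ', '%20')}" for k, v in options.items())
query_url = f"{ORG_URL}/api/data/v9.2/accounts?{query}"
print(query_url)
```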
When you try that, you'll receive a 401 (Unauthorized) error. This is because we haven't provided an access token with our credentials, a requirement you'll find across many of the API calls you attempt to make. For our business case, here are the steps to gather that access token:
1. We need to register our Dataverse CRM application in Azure Active Directory so we can generate client credentials for that environment. First, we'll visit portal.azure.com, navigate to the Active Directory homepage, and click "App registrations" on the lefthand pane. Once in App Registrations, click "+ New Registration" in the upper ribbon:
2. We'll give our application a name, select the supported account types, and enter our organization URL (which can be found in the Power Platform Admin Center, listed as the CRM URL for each environment):
3. Inside of the application registration, click "API permissions" in the lefthand pane, then click "+ Add a permission". From there we will select "Dynamics CRM" with enabled Delegated permissions. This will allow us to use impersonation in an API call for credentials:
4. Next, we'll click "Certificates & secrets" in the lefthand pane, then click "+ New client secret", choosing whatever date range you'd like the secret to be active for. Copy the secret's Value (not the Secret ID) before leaving the page and save it somewhere locally:
5. To finish our work in Azure AD, click "Overview" in the lefthand pane, and copy the Application (client) ID and Directory (tenant) ID to the same place you saved your secret Value:
6. Go to admin.powerplatform.microsoft.com and find the environment we're trying to access in your list of environments. We need to let the environment know to trust the app registration we just made. On your environment homepage, you'll see "S2S apps" on the righthand side of the screen. Click "See all" underneath.
7. On the upper ribbon, click "+ New app user". For App, select any application you've created in that environment that works with Dataverse data (create a quick Model-Driven app if you don't have one already), and select "System Administrator" for the security role. After you create this new app user, the App ID should match the App ID you copied from Azure AD:
We have now registered our Dataverse environment data in Azure AD, then provided that app registration access inside of the Power Platform Admin Center. Now we can query data!
Building The Flow
In Power Automate, we'll need to make an HTTP POST call to generate our bearer token. In Azure AD, you'll see that https://login.microsoftonline.com/{tenant_id}/oauth2/token is one way to generate credentials for that registered application. Here are the HTTP POST request details (curly brackets to be replaced with the IDs you saved locally):
URI: https://login.microsoftonline.com/{tenant_id}/oauth2/token
Headers: Content-Type = application/x-www-form-urlencoded
Body:
grant_type=client_credentials&
client_id={application_id}&
client_secret={client_secret}&
resource={your_organization_url}
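The same request can be sketched in Python to show how the form-encoded body comes together (all IDs below are hypothetical placeholders for the values you saved from Azure AD):

```python
from urllib.parse import urlencode

# Hypothetical IDs -- substitute the values you saved locally from Azure AD.
TENANT_ID = "00000000-0000-0000-0000-000000000000"
CLIENT_ID = "11111111-1111-1111-1111-111111111111"
CLIENT_SECRET = "your-client-secret-value"
ORG_URL = "https://contoso.crm.dynamics.com"

token_url = f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/token"

# Form-encoded body, matching the
# Content-Type: application/x-www-form-urlencoded header.
form = {
    "grant_type": "client_credentials",
    "client_id": CLIENT_ID,
    "client_secret": CLIENT_SECRET,
    "resource": ORG_URL,
}
body = urlencode(form)
print(body)
# The actual POST could then be sent with a library such as requests,
# e.g. requests.post(token_url, data=form)
```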
When you make that POST call, you will receive an access token in the body of the response. Using a "Compose" action in Power Automate, we can capture that response to use in the next HTTP call. For me, that expression was outputs('HTTP_-_Post_Access_Token').body.access_token:
And at last, we can make the HTTP GET call to query for the data we want using these request details:
URI: https://{your_organization_url}/api/data/v9.2/accounts?$select=name
Headers:
Content-Type = application/json
Authorization = Bearer {Compose - Access Token Outputs}
OData-MaxVersion = 4.0
OData-Version = 4.0
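As a sketch, the headers for that GET request would assemble like this (the token is a hypothetical placeholder for the Compose output from the previous step):

```python
# Hypothetical token captured from the previous Compose step.
access_token = "eyJ0eXAiOiJKV1QiLCJhbGciOiJSUzI1NiJ9.example"

headers = {
    "Content-Type": "application/json",
    "Authorization": f"Bearer {access_token}",  # token from the POST call
    "OData-MaxVersion": "4.0",
    "OData-Version": "4.0",
}
# The request itself could then be sent with a library such as requests,
# e.g. requests.get(f"{ORG_URL}/api/data/v9.2/accounts?$select=name",
#                   headers=headers)
```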
From there, you can use the "Parse JSON" action to parse the response, then compare your EnvironmentAccounts array to that JSON object to see where your environments overlap. There are a number of things you could do to improve this process, like using Tracked Properties in a Compose action to hide all of the IDs you use in this flow (generally important for API calls when you don't want to expose your IDs):
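The final comparison amounts to a set intersection. A sketch with hypothetical account data (the names and shapes below are made up; the real values come from your EnvironmentAccounts array and the Parse JSON output):

```python
# Hypothetical data: account names gathered from the current environment...
environment_accounts = ["Contoso Ltd", "Fabrikam Inc", "Adventure Works"]

# ...and rows parsed from the second environment's GET response.
second_env_accounts = [
    {"name": "Fabrikam Inc"},
    {"name": "Northwind Traders"},
    {"name": "Adventure Works"},
]

# Accounts that exist in both environments.
second_names = {row["name"] for row in second_env_accounts}
overlap = sorted(set(environment_accounts) & second_names)
print(overlap)  # ['Adventure Works', 'Fabrikam Inc']
```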
If you're interested in these steps or the countless other possibilities available to you, you can always reply to this post or find me on LinkedIn!
Conclusions
The extensibility of the Power Platform is one of the main selling points of the suite. No matter where your data lives, there will always be a way to reach it for querying and migration. I like this example because it demonstrates many of the common problems you'll run into when dealing with HTTP requests, e.g., delegating user access to the data, building custom queries, and forming the body of POST requests and the headers of GET requests. Keeping this data difficult to access is vital for organizational data privacy, but most platforms (including Microsoft's excellent documentation site) will provide you with the tools to find what you need, and the Power Platform will give you the tools to work with that data and build amazing things.
