
How to use an environment variable as the source in a dataverse dataflow


In Dynamics 365 Power Apps, I have a Dataverse dataflow that uses Power Query to connect to a Dataverse table. All works fine in dev; however, every time we deploy to another environment we have to manually update the data source. Is there a way to use an environment variable, e.g. in the following line in the dataflow advanced editor?

let
  Source = CommonDataService.Database("oursite-dev.crm11.dynamics.com"),
  #"Navigation 1" = . . .

E.g. I want to be able to do something like this:

let
  Source = CommonDataService.Database(EnvironmentVariables("siteUrl")),
  #"Navigation 1" = . . .

Many thanks

  • Tom_Gioielli (1,565)
    I saw another response come through on this topic and wanted to follow up with a clarification in case anyone comes across this question.
     
    The connector for Power Automate and Dataflows does not currently allow you to dynamically pass a parameter value into the dataflow. So any suggestion of using Power Automate to refresh the dataflow and pass in a parameter value will not work with the currently available functionality.
     
  • Suggested answer
    Daivat Vartak (v-9davar) (7,336), Super User 2025 Season 1
    Hello Barks365,
     

    You've hit upon a very common and frustrating challenge when deploying Dataverse dataflows across environments: the need to manually update data source URLs. Unfortunately, using environment variables directly within the Power Query M code of a Dataverse dataflow is not supported in the way you've described.

    The CommonDataService.Database() function expects a literal string for the server URL, and it doesn't have a built-in mechanism to interpret environment variables.

    However, there are effective workarounds that allow you to achieve the desired dynamic behavior:

    1. Parameterized Dataflows with Power Automate (Recommended)

    • Concept:

      • Create a dataflow with a parameter for the server URL.
      • Use Power Automate to trigger the dataflow and pass the environment-specific URL as the parameter value. 

    • Steps:

      1. Create a Parameter in Dataflow:

        • In your dataflow, go to the Power Query Editor.
        • Create a new parameter (e.g., SiteUrl).
        • Modify your Source step to use the parameter. 
        • Sample Code:

          let
              Source = CommonDataService.Database(SiteUrl),
              #"Navigation 1" = ...
          in
              #"Navigation 1"

      2. Create an Environment Variable:

        • In your target environment, create an environment variable (e.g., SiteUrl) with the appropriate URL value. 

      3. Create a Power Automate Flow:

        • Use the "When a flow is triggered" trigger.
        • Add a "Get environment variable" action to retrieve the URL from the environment variable.
        • Add a "Run a Dataflow" action.
        • In the "Run a Dataflow" action, pass the retrieved URL as the value for the SiteUrl parameter. 

      4. Trigger the Flow:

        • You can trigger the flow manually or schedule it to run periodically. 
         

    • Advantages:

      • Dynamic URLs without manual editing.
      • Clean and maintainable.
      • Leverages Power Automate's flexibility. 

    • Disadvantages:

      • Requires Power Automate flow setup.
      • Adds a layer of complexity.  
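
    For completeness, a dataflow parameter is itself just an M query carrying parameter metadata. A minimal sketch of what the SiteUrl parameter from step 1 could look like in the advanced editor (the default URL value here is hypothetical):

        // Parameter query named SiteUrl; the meta record marks it as a parameter.
        "https://oursite-dev.crm11.dynamics.com" meta [
            IsParameterQuery = true,
            Type = "Text",
            IsParameterQueryRequired = true
        ]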


    2. Configuration Table in Dataverse (Slightly More Complex)

    • Concept:

      • Create a custom configuration table in Dataverse to store environment-specific URLs.
      • Modify your dataflow to retrieve the URL from this table. 

    • Steps:

      1. Create Configuration Table:

        • Create a custom table (e.g., EnvironmentConfiguration) with a text column to store the URL.
        • Add a record for each environment, with the appropriate URL value. 

      2. Modify Dataflow:

        • Add a new query to your dataflow to retrieve the URL from the configuration table.
        • Use the retrieved URL in your Source step.
        • You can use a filter on the configuration table to get the record for the current environment. 
         

    • Advantages:

      • Dynamic URLs within Dataverse.
      • Centralized configuration management. 

    • Disadvantages:

      • Requires creating and managing a custom table.
      • Power Query M code becomes more complex.
      • Could be slower than other options.  
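
    The lookup described in step 2 could look roughly like this in M. All table and column names here (new_environmentconfiguration, new_name, new_siteurl) are hypothetical; substitute the schema names of your own configuration table. Note that the query reading the configuration table still needs a reachable server URL of its own, so this pattern relocates, rather than fully removes, the environment-specific value.

        let
            // Hypothetical: the connection used to read the config table.
            // This URL still has to resolve in the current environment.
            ConfigDb = CommonDataService.Database("oursite-dev.crm11.dynamics.com"),
            // Hypothetical config table with one row per named setting.
            ConfigTable = ConfigDb{[Schema = "dbo", Item = "new_environmentconfiguration"]}[Data],
            CurrentRow = Table.SelectRows(ConfigTable, each [new_name] = "SiteUrl"),
            SiteUrl = CurrentRow{0}[new_siteurl],
            // Use the retrieved URL as the actual data source.
            Source = CommonDataService.Database(SiteUrl)
        in
            Source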


    3. Deployment Pipeline with Parameter Replacement (Advanced)

    • Concept:

      • Use a deployment pipeline (e.g., Azure DevOps pipelines) to replace the URL in your dataflow's M code during deployment. 

    • Steps:

      1. Parameter Placeholder:

        • In your dataflow's M code, use a placeholder for the URL (e.g., __SITE_URL__). 

      2. Deployment Pipeline:

        • Use a task in your deployment pipeline to replace the placeholder with the environment-specific URL.
        • You can retrieve the URL from an environment variable or a configuration file. 
         

    • Advantages:

      • Fully automated deployment.
      • No manual editing required. 

    • Disadvantages:

      • Requires setting up and managing a deployment pipeline.
      • M code becomes less readable.  
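
    As a sketch of the placeholder approach, the checked-in M code contains the token rather than a real URL, and a pipeline step (for example, a simple text-replace task) substitutes the environment-specific value before the dataflow is imported into the target environment. The token name __SITE_URL__ is arbitrary and taken from the steps above:

        let
            // __SITE_URL__ is replaced with the real environment URL
            // by the deployment pipeline before import.
            Source = CommonDataService.Database("__SITE_URL__"),
            #"Navigation 1" = ...
        in
            #"Navigation 1"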

    Recommendation:

    • Parameterized Dataflows with Power Automate (Option 1) is generally the most recommended approach. It's relatively easy to implement, maintainable, and leverages Power Automate's strengths.
    • The configuration table option is also viable, but can add complexity.
    • The deployment pipeline option is best for advanced scenarios with automated deployments.

    Important Notes:

    • Always test your dataflows thoroughly in a non-production environment before deploying to production.
    • Ensure that your environment variables or configuration table records are secured appropriately.


    By implementing one of these workarounds, you can achieve dynamic data source URLs in your Dataverse dataflows, eliminating the need for manual updates during deployments.

     
    If my answer was helpful, please click Like, and if it solved your problem, please mark it as verified to help other community members find more. If you have further questions, please feel free to contact me.
     
    My response was crafted with AI assistance and tailored to provide detailed and actionable guidance for your Microsoft Dynamics 365 query.
     
    Regards,
    Daivat Vartak
  • Suggested answer
    Tom_Gioielli (1,565)
    I'm not aware of a way to get a Dataverse environment variable into a dataflow for the purpose of setting your source. Typically, I will use a parameter in the dataflow itself to update all of my queries with a single change, and it is simply part of the deployment process for myself and my clients.
     
    This might be a great idea to submit and upvote as a suggestion for the Dataflow feature in a future release.
     
     
     
  • sebastiendb (5)
    I was able to query the environment variables through the API when authenticating as a user that has the environment 'System Administrator' role. However, that API does not work for a user with the 'System Customizer' role. I was also unable to call the API using a service principal (an application user linked to an Entra ID app registration), and it's unclear why it doesn't work, because that same service principal (which has the 'System Administrator' role) does work for authenticating the Dataverse actions in Power Automate.
    Another problem is that this API call does not dynamically target the environment the dataflow resides in; I had to hard-code the environment URL (DEV/TEST/PROD) in the API call. This doesn't solve the problem of deploying a dataflow from one environment (DEV) to another (TEST), because the dataflow should indeed switch connections dynamically, based on the environment variables of the current environment.
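    For reference, the kind of call described above can be sketched in M against the Dataverse Web API. This is a hedged sketch: the entity set environmentvariabledefinitions exists in the standard Dataverse schema, but the expand relationship name is assumed here, the schemaname filter value is hypothetical, and the environment URL must still be hard-coded, which is exactly the limitation described.

        let
            // The environment URL still has to be hard-coded per environment,
            // which is the limitation described above.
            EnvUrl = "https://oursite-dev.crm11.dynamics.com",
            Raw = Json.Document(
                Web.Contents(
                    EnvUrl,
                    [
                        RelativePath = "api/data/v9.2/environmentvariabledefinitions",
                        Query = [
                            #"$select" = "schemaname,defaultvalue",
                            // Assumed relationship name to pull the per-environment override value.
                            #"$expand" = "environmentvariabledefinition_environmentvariablevalue($select=value)"
                        ]
                    ]
                )
            ),
            Definitions = Raw[value]
        in
            Definitions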
  • Kinson-Lam (WCC) (2)
    Any clue on this? I am keen to know how to use environment variables inside a dataflow. Thanks.
  • fshaikh (218)
    Hi, 
     
    Is there any solution to this problem yet?
     
  • UR-06110101-0 (4)
    Hello! Did you get a solution for this? I am facing the same issue.
