
Refactoring Pipelines and Implementing PowerApps Checker in DevOps for Dynamics 365 for Customer Engagement Solutions - DevOps Part 2

Welcome to the next entry in our blog series on DevOps for Dynamics 365 for Customer Engagement (D365 CE). This article builds on the material covered in the previous entries of the series, so if you want to follow along and haven't walked through those articles yet, I'd recommend you do so.

Introduction

Oftentimes in large development efforts, a codebase will rapidly approach a critical, messy mass where additional features become exponentially more difficult to implement. The developers' impulse here may be to stop all new feature development and redesign the entire system, but this is never a good idea. Instead, we need to make iterative changes to our code, cleaning it as we add new features. In this article, we are going to put this philosophy into practice.

As a new deliverable in our DevOps environment, we will be enforcing some code quality rules by including PowerApps Checker in the build stage of our pipeline. We can consume the results of the report generated by this module to prevent developers from introducing common D365 CE anti-patterns to our codebase.

Making Pipelines Reusable

It is very likely your D365 CE deployment will consist of more than one solution. While the components that comprise your solutions may be distinct, the pipelines you use to deploy those solutions will be quite similar, if not completely identical. That said, it would be an ironic shame if we started copying and pasting our pipeline code while adopting DevOps.

There are many strategies for organizing your D365 CE solution architecture in source control. We will be exploring some of them in later blog posts, but for the purposes of this article, let's assume that we've landed on a strategy which includes storing our pipeline templates in a separate repository. You can import, fork, or directly reference our pipeline repository, or create your own.

Introducing Stage Templates

Stage templates allow us to write the YAML for our stages in separate files, enabling us to gain the power of new multi-stage pipelines combined with the cleanliness of modularity. At the time of writing this article, there are templates for stages, jobs, steps, and variables. Eventually, we will be using each of them, but for now, we'll start with stage templates, leaving the world a little better than we found it.

stages:
- stage: Build
  jobs:
  - job:
    displayName: "Pack Solution from repository"

Excerpt from stages/build.yml

For the full source code of the stage template, click the link above. Since we're planning on using other template types here, let's think ahead a little bit and place our stage templates in a folder. Of course, the eventual increasing complexity of this project may call for that organizational strategy to change.
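For instance, the layout of the pipeline-templates repository might look something like this (test.yml is a stage template we'll add later in this article):

pipeline-templates/
└── stages/
    ├── build.yml
    ├── release.yml
    └── test.yml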

stages:
- stage: Release

  dependsOn: Build
  condition: succeeded('Build')

  jobs:
  - deployment: Deploy
    displayName: "Import solution artifact"

Excerpt from stages/release.yml

Migrating the Release stage to an external template should feel like an exercise in repetition at this point. We will be improving these stage templates soon, and I encourage you to experiment to find an organizational strategy that works for your team.

Referencing a Pipeline Repository From a Pipeline

Since the stage templates we've created are tracked in an external repository, we will need to make them available by adding a repository resource to our pipeline. The steps for this are going to vary slightly based on where your pipeline templates repository is stored:

Pipeline Templates in Azure Repos

If you chose to import our repository or create your own in Azure Repos, you can simply reference it by name in a repository entry, as explained in the documentation for the type parameter.

name: $(BuildDefinitionName)-$(Date:yyyyMMdd)$(Rev:.r)

trigger:
- master

resources:
  repositories:
  - repository: templates
    type: git
    name: pipeline-templates

stages:
- template: stages/build.yml@templates
- template: stages/release.yml@templates

azure-repos-build-release.yml

Pipeline Templates in GitHub

If you are using GitHub for your repository, you will need to create a service connection, if you don't have one already. Be sure to adhere to the principle of least privilege: since you only need to read from a public repository, make sure your service connection has only that privilege.

Steps to Create a GitHub Service Endpoint
  1. Create a personal access token in GitHub
    1. From GitHub, click on your Profile icon.
    2. Click Settings.
    3. Click Developer Settings.
    4. Click Personal access tokens.
    5. Click Generate new token.
    6. Enter a note, such as "Read Public Repositories", and check the box for public_repo.
    7. Click Generate token.
    8. Click the clipboard icon to copy the token.

Create a personal access token in GitHub

  2. Create a service connection in Azure DevOps.
    1. From Azure DevOps, click Project Settings.
    2. Click Service connections (in the Pipelines category).
    3. Click New service connection.
    4. Click GitHub.
    5. For Choose authorization, select Personal access token.
    6. Enter a Connection Name, such as "pipeline-templates".
    7. For Token, paste the token from your clipboard.
    8. Click OK.

Create a service connection in Azure DevOps

Now that you have a service connection, following the documentation for the type parameter, you can reference your repository by name and set the endpoint parameter to the name of your service connection. If you want to pin a specific version of the external repository, supply the tag in the ref parameter. For example, if you are using our public pipelines repository, enter a value of refs/tags/blog-part-2.0 for a version consistent with what we've learned so far. We will be updating this in a moment.

name: $(BuildDefinitionName)-$(Date:yyyyMMdd)$(Rev:.r)

trigger:
- master

resources:
  repositories:
  - repository: templates
    type: github
    name: microsoft-d365-ce-pfe-devops/D365-CE-Pipelines
    ref: refs/tags/blog-part-2.0
    endpoint: pipeline-templates

stages:
- template: stages/build.yml@templates
- template: stages/release.yml@templates

github-build-release.yml

After the changes we've made, if you attempt to run your Build & Release pipeline, you shouldn't notice any change in behavior.

Include PowerApps Checker in a Pipeline

At the time of writing, the D365 CE product group has just released the PowerApps Checker PowerShell module. In conjunction with this module, they have also released a collection of official DevOps tasks for invoking the service; we will cover those in another article in the near future. For now, we will be using the PowerShell module. Let's add it to our environment and produce a report on our built solution.
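If you'd like to explore the module locally before wiring it into a pipeline, you can install it from the PowerShell Gallery. A minimal sketch, assuming PowerShell 5.1 or later with PowerShellGet available:

# Install the checker module for the current user only
Install-Module -Name Microsoft.PowerApps.Checker.PowerShell -Scope CurrentUser

# List the cmdlets the module exposes
Get-Command -Module Microsoft.PowerApps.Checker.PowerShell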

For the purposes of this stage, we will be using the Invoke-PowerAppsChecker cmdlet. In order to call this, we're going to need a few new variables in our pipeline. At a minimum, we will need the following:

  • Geography (or ApiUrl)
  • Azure Tenant Id
  • Client Application Id
  • Client Application Secret
  • Ruleset Id

Create an Application Registration for PowerApps Checker

In order to add the Client Application Id / Secret, we will need to create an App Registration in Azure Active Directory. The documentation on PowerApps Checker includes a script to do this for you. If you'd rather do this manually from the portal, you can follow the instructions in the "How to", using the configurations provided below. If you choose to use the script, you can skip to generating a client secret.

Per the documentation on PowerApps Checker, provide the following options when creating your App Registration:

  • Redirect URI
    • Type: Public client (mobile & desktop)
    • Redirect URI: urn:ietf:wg:oauth:2.0:oob

Create a Client Application in Azure Active Directory

Then, in API permissions, grant access to the PowerApps-Advisor API, giving Application permissions to Analysis.All.

Grant application permissions of Analysis.All for the PowerApps-Advisor API

Finally, generate a secret for the registration.

Generate a client secret
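If you'd prefer to script these steps yourself rather than using the portal or the script from the documentation, a rough sketch using the AzureAD PowerShell module might look like the following. The display name here is an arbitrary example, and the Analysis.All permission grant is still easiest to perform from the portal as shown above:

# Assumes Connect-AzureAD has already been run
$app = New-AzureADApplication `
  -DisplayName "PowerApps Checker Client" `
  -PublicClient $true `
  -ReplyUrls "urn:ietf:wg:oauth:2.0:oob"

# Generate a client secret valid for one year
$secret = New-AzureADApplicationPasswordCredential `
  -ObjectId $app.ObjectId `
  -EndDate (Get-Date).AddYears(1)

# $app.AppId is your Client Application Id; $secret.Value is the secret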

Create Additional Pipeline Variables Required to Call PowerApps Checker

Now that we have a client application that we can use to authenticate with the PowerApps Checker service, we can create the additional pipeline variables. Open up the pipeline we currently use for Build / Release and add the following variables:

  • azure.geography - Region of the data center used to temporarily store the reports generated by the PowerApps Checker service.

    Note: Based on your implementation, you may need to use ApiUrl instead. If so, be sure to modify the YAML file accordingly.

  • azure.tenantId - Example: ab2cea59-d1df-4950-ad30-3d7f43d1a8d8

  • powerAppsChecker.clientId - Example: 24076c4b-1a57-4ca1-9da5-42f367cf57d8

  • powerAppsChecker.clientSecret - Generated in the steps shown above. Be sure to change this variable type to secret.

You can get azure.tenantId and powerAppsChecker.clientId from the Overview tab for the App registration you created.

PowerApps Checker Client Application Registration Overview

Retrieve the Ruleset Id

The final variable we will need in order to invoke PowerApps Checker is powerAppsChecker.rulesetId. You can use the Get-PowerAppsCheckerRulesets cmdlet to retrieve the available guids (a quick sketch follows the list below). At the time of writing, there are two rulesets available. The guids will never change, so you are free to store them as pipeline variables and re-use them across your pipelines.

  • powerAppsChecker.rulesetId
    • 083a2ef5-7e0e-4754-9d88-9455142dc08b for AppSource Certification
    • 0ad12346-e108-40b8-a956-9a8f95ea18c9 for Solution Checker
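As a quick sketch, assuming Get-PowerAppsCheckerRulesets accepts the same Geography parameter used by Invoke-PowerAppsChecker, you can list the rulesets like this:

Import-Module Microsoft.PowerApps.Checker.PowerShell

# UnitedStates is an example geography; use the one closest to you
Get-PowerAppsCheckerRulesets -Geography UnitedStates | Select-Object Id, Name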

Install Sarif Viewer Build Tab

The PowerApps Checker service will generate a JSON file following the SARIF schema. Once we invoke the service from our pipeline, we will want a convenient way to view the results. The Microsoft DevLabs team has released an experimental, open-source Azure DevOps extension for this called Sarif Viewer Build Tab. Open that link and install it into your Azure DevOps organization in order to view the results of PowerApps Checker within your Pipeline summary.

Invoke PowerApps Checker From a Test Stage in a Pipeline

We now have all of the pieces in place to add an additional stage to our Pipeline and start invoking PowerApps Checker. Add the following YAML file to your pipelines repository, or if you're referencing ours, update your reference to use the blog-part-2.1 tag, which contains the newest test.yml file.

stages:
- stage: Test

  dependsOn: Build
  condition: succeeded('Build')

  jobs:
  - job:
    displayName: 'Run PowerApps Checker'

    pool:
      vmImage: 'vs2017-win2016'

    steps:
    - task: DownloadBuildArtifacts@0
      inputs:
        buildType: 'current'
        downloadType: 'single'
        artifactName: 'drop'
        downloadPath: '$(System.ArtifactsDirectory)'

    - powershell: Install-Module -Name Microsoft.PowerApps.Checker.PowerShell -Scope CurrentUser -Force
      displayName: 'Install Microsoft.PowerApps.Checker.PowerShell'

    - powershell: |
        md '$(Common.TestResultsDirectory)\powerapps-checker'

        Import-Module Microsoft.PowerApps.Checker.PowerShell
        $ruleset = New-Object Microsoft.PowerApps.Checker.Client.Models.Ruleset
        $ruleset.Id = [Guid]::Parse('$(powerAppsChecker.rulesetId)')

        Invoke-PowerAppsChecker `
          -Geography $(azure.geography) `
          -ClientApplicationId $(powerAppsChecker.clientId) `
          -TenantId $(azure.tenantId) `
          -Ruleset $ruleset `
          -FileUnderAnalysis '$(System.ArtifactsDirectory)\drop\packedSolution\$(solution.name)_managed.zip' `
          -OutputDirectory '$(Common.TestResultsDirectory)\powerapps-checker' `
          -ClientApplicationSecret (ConvertTo-SecureString -AsPlainText -Force -String '$(powerAppsChecker.clientSecret)')
      displayName: 'Invoke PowerApps Checker'

    - powershell: md '$(Common.TestResultsDirectory)\powerapps-checker\unzipped'
      displayName: 'Create folder for unzipped results'

    - task: ExtractFiles@1
      inputs:
        archiveFilePatterns: '$(Common.TestResultsDirectory)\powerapps-checker\*.zip'
        destinationFolder: '$(Common.TestResultsDirectory)\powerapps-checker\unzipped'
      displayName: 'Extract results to folder'

    - task: PublishBuildArtifacts@1
      inputs:
        pathtoPublish: '$(Common.TestResultsDirectory)\powerapps-checker\unzipped'
        artifactName: CodeAnalysisLogs
      displayName: 'Publish PowerApps Checker report artifacts'

stages/test.yml

Note that at the time of writing, it is important that your artifact be called "CodeAnalysisLogs". The current version of the Sarif Viewer Build Tab extension is hard-coded to read only from an artifact with that name. Also, the extension will not read from zip files, so you need to extract the SARIF files into that folder before publishing.

You will also need to update your main YAML file to include a reference to the newly created stage:

name: $(BuildDefinitionName)-$(Date:yyyyMMdd)$(Rev:.r)

trigger:
- master

resources:
  repositories:
  - repository: templates
    type: github
    name: microsoft-d365-ce-pfe-devops/D365-CE-Pipelines
    ref: refs/tags/blog-part-2.1
    endpoint: pipeline-templates

stages:
- template: stages/build.yml@templates
- template: stages/test.yml@templates
- template: stages/release.yml@templates

github-build-test-release.yml
(Azure Repos version: azure-repos-build-test-release.yml)

Once you've committed that, you should be able to run your pipeline and watch as the Test stage completes prior to beginning the Release stage. Once the Test stage completes, from the pipeline summary, you should see a "Scans" tab, which will show the results of the PowerApps Checker.

PowerApps Checker Scans Tab (No Results)

However, right now, if you are using the sample solution from our tutorial repository, this view is pretty unexciting. Feel free to verify it by adding a solution component that violates a PowerApps Checker rule. Alternatively, you can download and check in the newest ExtractedSolution folder from our tutorial repository for a version of the solution containing a JavaScript file that should throw off some red flags:

PowerApps Checker Scans Tab (With Results)

That's it for this article! There's still plenty of room to add value here. As a fun challenge for yourself, see if you can parse the results of the PowerApps Checker programmatically and actually cause the Test stage to fail if any results are found. We will be providing a solution to this in our next entry. (Hint: The PowerApps Checker result includes a property called IssueSummary.)
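If you want to attempt the challenge, here is a rough sketch of the idea. It assumes IssueSummary exposes per-severity counts with names like CriticalIssueCount and HighIssueCount, and $checkerParams stands in for the same parameters passed to Invoke-PowerAppsChecker in test.yml:

# Capture the checker result instead of discarding it
$result = Invoke-PowerAppsChecker @checkerParams

# Fail this step (and therefore the Test stage) if blocking issues were found
$summary = $result.IssueSummary
if ($summary.CriticalIssueCount -gt 0 -or $summary.HighIssueCount -gt 0) {
  Write-Host "##vso[task.logissue type=error]PowerApps Checker found blocking issues."
  exit 1
}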

Thanks for reading, feel free to ask any questions in the comments section below, and happy DevOps-ing!

Comments


  • Tyler Hogsett
    Hey Artem! You are definitely asking the right questions. These are all topics we are still discussing within the PFE team and with the D365 CE product group. There will be more official information available in the near future, but I can give you some of the basic premises we are considering in approaching these topics. The biggest pain point in customizing D365 CE is that traditionally, the platform has been its own editor and doesn't have a native way to track and version control changes. The product group is working towards separating this coupling; for example, the official place to manage solutions is now https://make.powerapps.com, instead of the legacy web app UI. For the purposes of DevOps, we want to be thinking the same way, and treat our environments as being part of our IDE.

    1. We are big fans of the Gitflow Workflow. If you aren't familiar with it, Atlassian does a pretty good job explaining it: www.atlassian.com/.../gitflow-workflow Consider building out your CI / CD process around the Gitflow Workflow in combination with the premise that each environment is part of a developer's IDE.

    2. Following from that premise, each developer should probably be working in one org per branch they are working in. The exception here would be if two developers are collaborating on the same feature. If two or more developers are actively making changes in a single org, at the very least, they should be in a call together, actively sharing what they're working on in conversation. Think of shared development orgs in the same way you might think of a Visual Studio Live Share session. Merging changes during concurrent development is perhaps the biggest gray area for us right now. Again, I would emphasize thinking of an environment as being part of your IDE, and consider how you merge changes for any code-based project: in order to successfully merge from a feature branch down into the develop branch, the feature branch needs to be ahead of your develop branch. So any changes that have been made to the develop branch need to be pulled back into the feature branch and merged there by the developer. In this scenario, you might consider building out some tooling for automatically merging any changes that can be done trivially without chance of conflict, and then performing manual steps to merge potential conflicts. It should feel as close as you can get to the "ours vs. theirs" approach you use in your other IDEs.

    3. We don't have an official recommendation for solution versioning, but I would recommend considering whether you are going to need patches as part of your deployment process. In my own opinion, patches should be thought of as hotfixes, and though hotfixes are part of the Gitflow Workflow, in an ideal world they should only be needed in an absolute emergency, which should be rare if you are employing automated testing in your process. Otherwise, "urgent" changes should never be so urgent that you need to break your deployment process in order to implement them. This is largely thanks to the fact that you are deploying small sets of changes to production as often as possible. If you are not using patches, honestly, the only law of solution versioning is that your versions be sequential: the solution version in a feature branch environment should be ahead of develop if no changes have been made to develop since the branch was created, and behind develop if another feature branch has been merged into develop after your branch was created and before you pull the changes from develop back up into your feature branch. There are many changes coming up regarding solution layering and management, so continue to stay tuned for those. We will also be elaborating on conceptual questions like the ones you have asked here in future blog articles. Thank you for your interest!
  • Artem Grunin
    Thanks for the article! Can you please also share more info about:
    1. Your branching strategy + CI/CD setup to support it
    2. Environment setup: shared development orgs? Separate orgs for branches? What flow do you have to move solutions between environments?
    3. How do you do solution versioning? Is it manual, scripted, or automated?