Introduction
With Microsoft-managed Tier 1 environments being retired on 1st February, customers who use the build environment to create build releases should find it fairly simple to move to one of the options below.
There are two options going forward to create a build:
- Create a new cloud-hosted environment to replace the Microsoft-managed Tier-1 environment.
- Use a Microsoft-hosted agent on Azure Pipelines, which means we no longer need to rely on a dedicated environment to run a build.
This blog post focuses on the second option, as it is a more cost-effective and lower-maintenance solution for most customers when the extended capabilities of a dedicated VM are not needed. We will demonstrate how to set everything up, ready to schedule builds on Azure Pipelines without the need for a VM. This means no patching of the operating system and no servicing of the Dynamics 365 Finance & Operations apps.
Table of Contents
- Cost Consideration
- Azure Pipeline – Microsoft-hosted agent
- Azure Artifacts
- Example of Implementation
- Lifecycle Services
- Azure DevOps Artifacts
- Visual Studio Solution for the Build
- Configure DevOps workspace
- Create Visual Studio Solution and Projects
- Azure DevOps Pipelines
- Publish NuGet Packages
- Create Azure Active Directory Application
- Create a Service Connection in Azure DevOps
- Create Azure Pipeline
- Build Automation
- Create Azure Pipeline
- For version 10.0.18 onwards
- Conclusion
1 Cost Consideration
1.1 Azure Pipeline - Microsoft-hosted agent
Azure DevOps provides a free tier of service by default for every organization:
- One free parallel job that can run for up to 60 minutes each time
- 1,800 minutes (30 hours) per month
Each parallel job allows you to run a single job at a time in your organization. On the free tier, you can only execute one pipeline at a time; you will have to purchase additional parallel jobs to run multiple pipelines at the same time.
When additional parallel jobs are purchased, they remove the monthly time limit and allow each job to run for up to 360 minutes (6 hours).
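For example, on the free tier a 45-minute build can run at most 40 times per month (40 × 45 = 1,800 minutes) and never more than one at a time, while a build longer than 60 minutes would not complete at all without a purchased parallel job.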
Configure and pay for parallel jobs - Azure DevOps | Microsoft Docs
1.2 Azure Artifacts
Azure Artifacts includes a free usage tier of 2 GB. If your usage exceeds the limit, you will be charged for the actual usage.
Azure DevOps Services Pricing | Microsoft Azure
2 Example of implementation
The goal of this example is to set up two pipelines:
- Publish NuGet Packages to Artifacts
- This is the pipeline that downloads the application references and tools from LCS and publishes them into Azure DevOps Artifacts.
- Build Automation
- This is the pipeline that creates the build.
- Because the pipeline uses a Microsoft-hosted agent, some prerequisite steps need to be executed before compiling the code and creating the deployable package. These steps consume the packages from Azure DevOps Artifacts that were published by the first pipeline; conceptually, this is a NuGet restore, as sketched below.
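A minimal sketch of that restore, assuming the nuget.config and packages.config files described in section 2.2 and a hypothetical local packages directory (the imported pipeline template configures the actual task for you):

nuget.exe restore packages.config -PackagesDirectory C:\NuGets -ConfigFile nuget.config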
2.1 Lifecycle Services
- Navigate to Asset Library in the LCS project
- Select NuGet packages from the asset type
- Click 'Import' and pick the F&O version that you would like to use in the build. This is the version your package will be compiled against, and it should be in line with the environment where the package will be deployed, whether that is a testing or production environment.
- There are three packages to be imported for version 10.0.17 or earlier:
- Platform Build Reference
- Application Build Reference
- Compiler Tools
- There are four packages to be imported for version 10.0.18 or later:
- Platform Build Reference
- Application Build Reference
- Application Suite Build Reference
- Compiler Tools
- Select each package file in LCS and take note of the 'Description'; this is the version of the package that will be needed later
2.2 Azure DevOps Artifacts
- Navigate to Azure DevOps > Artifacts > Create Feed > Create new feed
- Click 'Connect to feed'
- Choose NuGet.exe
- Create a nuget.config file locally by copying the parameters from DevOps; it should look like this:
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <clear />
    <add key="Dynamics365Build" value="https://testproject.pkgs.visualstudio.com/e2c39476-07fc-4c47-a256-eebeaf0b5084/_packaging/Dynamics365Build/nuget/v3/index.json" />
  </packageSources>
</configuration>
- Create a packages.config file locally using the following template, and insert the version of each package you would like to use in the build. The versions can be taken from the previous step and will need to be updated whenever you want to build against a newer F&O version.
For version 10.0.18 or later:
<?xml version="1.0" encoding="utf-8"?>
<packages>
  <package id="Microsoft.Dynamics.AX.Platform.DevALM.BuildXpp" version="7.0.5968.16973" targetFramework="net40" />
  <package id="Microsoft.Dynamics.AX.Application.DevALM.BuildXpp" version="10.0.793.16" targetFramework="net40" />
  <package id="Microsoft.Dynamics.AX.ApplicationSuite.DevALM.BuildXpp" version="10.0.793.16" targetFramework="net40" />
  <package id="Microsoft.Dynamics.AX.Platform.CompilerPackage" version="7.0.5968.16973" targetFramework="net40" />
</packages>
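For version 10.0.17 or earlier, the same template applies without the Application Suite entry, since only three packages are imported (again, substitute the versions noted from LCS):

<?xml version="1.0" encoding="utf-8"?>
<packages>
  <package id="Microsoft.Dynamics.AX.Platform.DevALM.BuildXpp" version="7.0.5968.16973" targetFramework="net40" />
  <package id="Microsoft.Dynamics.AX.Application.DevALM.BuildXpp" version="10.0.793.16" targetFramework="net40" />
  <package id="Microsoft.Dynamics.AX.Platform.CompilerPackage" version="7.0.5968.16973" targetFramework="net40" />
</packages>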
- Configure retention policies to keep the storage cost to a minimum; be mindful of the 2 GB free storage space
- Navigate to 'Feed settings' by clicking on the 'Wheel' icon
- Under 'Retention policies' section, enable 'Enable package retention'
- Set the 'Maximum number of versions per package' and 'Days to keep recently downloaded packages'
2.3 Visual Studio Solution for the Build
Create a Dynamics 365 project for each package that you would like to include in the build. Assign one of the models within that package to the Dynamics 365 project.
In the following example there are two packages to be built, so we are going to create two Dynamics 365 projects within a solution. The projects don't have to contain any objects.
2.3.1 Configure DevOps workspace
- Create a new folder in Azure DevOps > Repos
- This can be created within the Main folder or each branch
- Click on the three dots in the file tree > New > Folder
- In this example, we are creating a new folder 'BuildAutomation'
- Create a new folder locally, e.g. C:\BuildAutomation
- Map the source control folder to the local folder
- Edit the workspace in Visual Studio and add a new binding
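As an alternative to the Visual Studio UI, the same mapping can be created from the command line with tf.exe (a sketch; the collection URL and workspace name are placeholders):

tf workfold /map "$/Project/Trunk/Main/BuildAutomation" "C:\BuildAutomation" /collection:https://dev.azure.com/YourOrganization /workspace:YourWorkspace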
2.3.2 Create Visual Studio Solution and Projects
- Create a new Dynamics 365 Project in Visual Studio for Model A
- Set the name to the name of the model
- Set the location to the local folder that was created in the previous step.
- Set the solution name to a meaningful name e.g. BuildAutomation
- Make sure both checkboxes are enabled
- Create directory for solution
- Add to Source Control
- Assign the model to the Project
- Create a new Dynamics 365 project for Model B in the existing solution
- Right-click on the solution from Solution Explorer
- Add > New Project
- Assign the model to the project
- The solution should contain two projects, each representing a model in a package that is to be built
- Add nuget.config and packages.config in Source Control Explorer to the folder that was created in the previous steps
- Check in the solution, projects, and the above two config files
2.4 Azure DevOps Pipelines
2.4.1 Publish NuGet Packages
Each time the build pipeline runs, we need to make sure the right versions of the F&O dependencies, such as the compiler tools and application references, are installed on the Microsoft-hosted agent. You can manually download the NuGet packages from Lifecycle Services (LCS) and publish them to the DevOps Artifacts feed through the CLI or PowerShell.
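As a minimal sketch of that manual route, assuming nuget.exe with the Azure Artifacts Credential Provider is installed and the feed name matches the nuget.config created earlier (the .nupkg file name is illustrative):

nuget.exe push -Source "Dynamics365Build" -ApiKey az Microsoft.Dynamics.AX.Platform.CompilerPackage.7.0.5968.16973.nupkg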
Alternatively, you can leverage an Azure Pipeline to automate this process. The following is an example of how to do this:
2.4.1.1 Create Azure Active Directory Application (Optional)
- Navigate to the Azure Portal > Azure Active Directory > Manage > App registrations
- Click 'New registration', provide a name, and click 'Register'
- Navigate to Manage > API permissions > Configured permissions > Add a permission
- Select 'Dynamics Lifecycle services' under 'APIs my organization uses'
- Add 'user_impersonation' permissions
- Click 'Grant admin consent'
- Navigate to Manage > Authentication
- In Advanced settings > Allow public client flows, set 'Enable the following mobile and desktop flows' to 'Yes'
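If you prefer scripting, the registration itself can be created with the Azure CLI (a minimal sketch; the display name is an example, and the API permission and public client flow settings above still need to be configured afterwards):

az ad app create --display-name "D365-LCS-BuildPipeline"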
2.4.1.2 Create a Service Connection in Azure DevOps (Optional)
- Navigate to the Project Settings of the DevOps project > Pipelines > Service connections
- Click 'New service connection'
- Choose 'Dynamics Lifecycle Services'
- Enter the following details:
- Username: The LCS credential that has access to the Project and Asset Library
- Password: The LCS credential that has access to the Project and Asset Library
- Application ID: This is the App/Client ID that was created in the previous step
- Service connection name: Provide a meaningful name
- Click 'Save'
2.4.1.3 Create Azure Pipeline
You can create the pipeline from scratch or use the following steps to import it. The pipeline contains only two steps:
- Dynamics LCS Asset Download
- NuGet push
- Navigate to DevOps project > Pipelines
- Click on the 'Three Dots' button next to 'New pipeline', and click 'Import a pipeline'
- Download and import this template: Azure-Pipelines/Dynamics 365 F&O - Publish NuGet Packages to Artifacts.json at main · kobonet/Azure-Pipelines (github.com)
- Enter the following details:
- Pipeline
- Agent pool: Azure Pipelines
- Agent Specification: vs2017-win2016
- Get sources
- Select a source: TFVC
- Workspace mappings: Select a server path that contains a minimal number of files; the source download is not actually required by this pipeline, so a small mapping reduces pipeline execution time
- Download from LCS project
- LCS Connection: Select the service connection that was created from the previous step
- LCS Project Id: This is the project ID that can be found from the URL when you are in the LCS project
- Publish NuGet Packages to Artifacts
- Target feed: Select the feed that was created from the previous step
- Click 'Save & queue' to run the pipeline
2.4.2 Build Automation
2.4.2.1 Create Azure Pipeline
- Navigate to DevOps project > Pipelines
- Click on the 'Three Dots' button next to 'New pipeline', and click 'Import a pipeline'
- Download and import this template: Dynamics365-Xpp-Samples-Tools/xpp-classic-ci.json at master · microsoft/Dynamics365-Xpp-Samples-Tools (github.com)
- Navigate to the Variables tab
- Change NugetConfigsPath to the path that contains packages.config and nuget.config. In this example, these files are stored in the BuildAutomation folder within the Main folder, e.g. $/Project/Trunk/Main/BuildAutomation.
- You might have to remove and re-add the following tasks:
- Update Model Version
- Create Deployable Package
- X++ Tools Path: $(NugetsPath)\$(ToolsPackage)
- Add Licenses to Deployable Package
- Disable this step if this is not applicable
- Enter the following details:
- Pipeline
- Agent pool: Azure Pipelines
- Agent Specification: vs2017-win2016
- Get sources
- Select a source: TFVC
- Workspace mappings: Set the 'Server path' to the branch that you would like the build to run, e.g. $/Project/Trunk/Main
- Build solution **\*.sln
- Solution: Navigate to the path of the Visual Studio solution that was created in the previous step, e.g. $/Project/Trunk/Main/BuildAutomation/BuildAutomation.sln. This is how the build pipeline knows which packages to build. It is also possible to point to existing Visual Studio projects (e.g. Trunk/Main/Projects) instead of a dedicated solution; however, the same packages might then be built multiple times, which would dramatically increase the build time.
- Click 'Save & queue' to run the pipeline
For version 10.0.18 onwards
A new NuGet package is introduced from version 10.0.18 onwards, which requires changes to packages.config and the build pipeline. If you are creating the build pipeline for the first time, you may skip this step, as you will automatically receive these changes when you download and import the pipeline template.
Update packages.config
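Add the Application Suite Build Reference entry alongside the existing three packages (the version shown is the example used earlier; substitute the version noted from LCS):

<package id="Microsoft.Dynamics.AX.ApplicationSuite.DevALM.BuildXpp" version="10.0.793.16" targetFramework="net40" />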
Update Build Pipeline
- Navigate to DevOps project > Pipelines
- Select the build pipeline that was created in 2.4.2.1 and click 'Edit'
- Navigate to the Variables tab
- Click 'Add' and enter the following details:
- Name: AppSuitePackage
- Value: Microsoft.Dynamics.AX.ApplicationSuite.DevALM.BuildXpp
- Navigate to the Tasks tab
- Select 'Build solution' step
- Change the value of MSBuild Arguments to the following (the change adds the $(AppSuitePackage) reference folder):
/p:BuildTasksDirectory="$(NugetsPath)\$(ToolsPackage)\DevAlm"
/p:MetadataDirectory="$(MetadataPath)"
/p:FrameworkDirectory="$(NugetsPath)\$(ToolsPackage)"
/p:ReferenceFolder="$(NugetsPath)\$(PlatPackage)\ref\net40;$(NugetsPath)\$(AppPackage)\ref\net40;$(NugetsPath)\$(AppSuitePackage)\ref\net40;$(MetadataPath);$(Build.BinariesDirectory)"
/p:ReferencePath="$(NugetsPath)\$(ToolsPackage)"
/p:OutputDirectory="$(Build.BinariesDirectory)"
Click 'Save & queue' > 'Save'
Conclusion
Whilst the steps above may look a little complex because there are a lot of moving parts involved, it is actually very straightforward to set up. A couple of points to remember:
- Consider the cost implications: doing a regular cleanup of the artifacts can lower your cost, and additional parallel runs of the pipelines will require purchasing add-ons.
- Always make sure the following prerequisites are completed before executing the build pipeline:
- Validate and make sure the right version of F&O apps NuGet packages are published to Azure DevOps Artifact.
- If you are using a pipeline to automate this process, make sure you import the packages in LCS’s Asset Library before scheduling the pipeline to run.
- Modify and check in the packages.config file with the version number used above. If you have multiple branches, you will have to maintain this config file in each branch individually.
- If you are going to introduce a new package as part of your customization, make sure you create a new project in the build solution so the new package is picked up as part of the build process.
Thank you for reading, and please feel free to leave a comment below if you have any questions or feedback!
Special thanks to José Antonio Estevan and Alessandro Colombini for their contribution on peer reviewing this blog post.