YAML Structure

So, what does a YAML pipeline for F&O look like? Not so fast: first, a brief intro to the YAML format.

YAML structure 101 #

YAML (short for “YAML Ain’t Markup Language”) is a human-readable data serialization format commonly used for configuration files and data exchange between languages.

Unlike formats such as JSON or XML, YAML is designed to be easy to write and understand at a glance.

YAML’s structure is based on indentation rather than brackets or braces, making it clean and intuitive to read. Here’s a quick breakdown of its main elements:

Key–value pairs: The basic building blocks. Keys and values are separated by a colon and a space:

name: Alice
age: 30

Nested data: Indentation (usually two spaces per level) is used to represent hierarchy:

person:
  name: Alice
  age: 30

Lists: Represented by a dash and a space:

fruits:
  - apple
  - banana
  - mango

Comments: Start with a hash and continue to the end of the line:

# This is a comment
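Putting these elements together, a small YAML document combining key–value pairs, nesting, lists, and comments might look like this (the content itself is just an illustration):

```yaml
# A person record with a nested list
person:
  name: Alice        # key-value pair inside a nested mapping
  age: 30
  hobbies:           # a list nested under person
    - reading
    - cycling
```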

Pipeline structure #

Now that we know a bit about YAML, here’s how a sample CI pipeline looks:

# Sample YML pipeline for X++ builds
# For more information on build pipelines, see
# https://docs.microsoft.com/en-us/dynamics365/fin-ops-core/dev-itpro/dev-tools/hosted-build-automation

# Change the name of the build to a four-part version number to be used for model versioning
name: $(Date:yy.MM.dd)$(Rev:.r)
trigger:
  branches:
    include:
      - main
  paths:
    include:
      - Metadata/**

pool:
  vmImage: 'windows-latest'
  demands:
    - msbuild
    - visualstudio
  
# Declare some shorthand for NuGet package names
# Make editing the path for metadata and NuGet extraction folder easier
variables:
  App1Package: 'Microsoft.Dynamics.AX.Application1.DevALM.BuildXpp'
  App2Package: 'Microsoft.Dynamics.AX.Application2.DevALM.BuildXpp'
  AppSuitePackage: 'Microsoft.Dynamics.AX.ApplicationSuite.DevALM.BuildXpp'
  PlatPackage: 'Microsoft.Dynamics.AX.Platform.DevALM.BuildXpp'
  ToolsPackage: 'Microsoft.Dynamics.AX.Platform.CompilerPackage'
  MetadataPath: '$(Build.SourcesDirectory)\Metadata'
  NugetConfigsPath: '$(Build.SourcesDirectory)\Tools\Build'
  NugetsPath: '$(Pipeline.Workspace)\NuGets'

steps:
# Install NuGet and use -ExcludeVersion option to avoid paths containing version numbers
- task: NuGetCommand@2
  displayName: 'NuGet custom install Packages'
  inputs:
    command: custom
    arguments: 'install -Noninteractive $(NugetConfigsPath)\packages.config -ConfigFile $(NugetConfigsPath)\nuget.config -Verbosity Detailed -ExcludeVersion -OutputDirectory "$(NugetsPath)"'

# Build using MSBuild 17 (VS 2022)
# Provide the needed paths, including semi-colon separated list of reference folders
# /p:ReferenceFolder are metadata folders containing other (compiled) X++ packages that are referenced
# /p:ReferencePath are folders containing non-X++ assemblies referenced (aside from one already in the output folder for the package)
- task: VSBuild@1
  displayName: 'Build solution **\*.sln'
  inputs:
    solution: '**/*.sln'
    vsVersion: '17.0'
    msbuildArgs: '/p:BuildTasksDirectory="$(NugetsPath)\$(ToolsPackage)\DevAlm" /p:MetadataDirectory="$(MetadataPath)" /p:FrameworkDirectory="$(NuGetsPath)\$(ToolsPackage)" /p:ReferenceFolder="$(NuGetsPath)\$(PlatPackage)\ref\net40;$(NuGetsPath)\$(App1Package)\ref\net40;$(NuGetsPath)\$(App2Package)\ref\net40;$(NuGetsPath)\$(AppSuitePackage)\ref\net40;$(MetadataPath);$(Build.BinariesDirectory)" /p:ReferencePath="$(NuGetsPath)\$(ToolsPackage)" /p:OutputDirectory="$(Build.BinariesDirectory)"'

This is based on the sample available in Microsoft's GitHub repository of X++ samples and tools.

Let’s start from the top. First we have the name of the pipeline, and after it a trigger:

# Change the name of the build to a four-part version number to be used for model versioning
name: $(Date:yy.MM.dd)$(Rev:.r)
trigger:
  branches:
    include:
      - main
  paths:
    include:
      - Metadata/**

You can see that after the trigger keyword there are indented elements. Everything indented under a keyword belongs to it, like a subsection.

In this case we have branches, and we only include main. We've also got a paths filter on Metadata, so the pipeline only runs when something under that folder changes, instead of on every commit to the main branch.
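Triggers can also exclude branches or paths. As an illustration (this is not part of the sample above, and the branch and path names are hypothetical), you could skip builds for documentation-only changes like this:

```yaml
trigger:
  branches:
    include:
      - main
    exclude:
      - experimental/*     # hypothetical branch pattern, for illustration
  paths:
    include:
      - Metadata/**
    exclude:
      - Metadata/**/*.md   # don't trigger on documentation changes
```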

Then we have the pool:

pool:
  vmImage: 'windows-latest'
  demands:
    - msbuild
    - visualstudio

This is similar to the Agent job settings in classic pipelines: it defines which agent image to use and which capabilities (demands) the agent must have.
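If you run your own build agents instead of Microsoft-hosted ones, the pool is referenced by name rather than by image. A minimal sketch (the pool name here is hypothetical):

```yaml
pool:
  name: 'MySelfHostedPool'   # hypothetical self-hosted agent pool name
  demands:
    - msbuild
    - visualstudio
```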

Next we have variables:

# Declare some shorthand for NuGet package names
# Make editing the path for metadata and NuGet extraction folder easier
variables:
  App1Package: 'Microsoft.Dynamics.AX.Application1.DevALM.BuildXpp'
  App2Package: 'Microsoft.Dynamics.AX.Application2.DevALM.BuildXpp'
  AppSuitePackage: 'Microsoft.Dynamics.AX.ApplicationSuite.DevALM.BuildXpp'
  PlatPackage: 'Microsoft.Dynamics.AX.Platform.DevALM.BuildXpp'
  ToolsPackage: 'Microsoft.Dynamics.AX.Platform.CompilerPackage'
  MetadataPath: '$(Build.SourcesDirectory)\Metadata'
  NugetConfigsPath: '$(Build.SourcesDirectory)\Tools\Build'
  NugetsPath: '$(Pipeline.Workspace)\NuGets'

Classic pipelines have variables too. Here we define values once and reuse them later in the pipeline.
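Once declared, a variable can be referenced anywhere later in the pipeline with the $(VariableName) macro syntax, as the tasks below do. A small sketch of that usage (the step itself is just an illustration):

```yaml
steps:
  # $(MetadataPath) is expanded before the step runs
  - script: echo "Metadata lives in $(MetadataPath)"
    displayName: 'Show metadata path'
```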

And finally, we have steps, and the steps contain tasks, just like a classic pipeline. Tasks are the individual processes that run during a build: for example, updating the model version, installing the NuGet packages, compiling the code, or creating the deployable package.

steps:
# Install NuGet and use -ExcludeVersion option to avoid paths containing version numbers
- task: NuGetCommand@2
  displayName: 'NuGet custom install Packages'
  inputs:
    command: custom
    arguments: 'install -Noninteractive $(NugetConfigsPath)\packages.config -ConfigFile $(NugetConfigsPath)\nuget.config -Verbosity Detailed -ExcludeVersion -OutputDirectory "$(NugetsPath)"'

# Build using MSBuild 17 (VS 2022)
# Provide the needed paths, including semi-colon separated list of reference folders
# /p:ReferenceFolder are metadata folders containing other (compiled) X++ packages that are referenced
# /p:ReferencePath are folders containing non-X++ assemblies referenced (aside from one already in the output folder for the package)
- task: VSBuild@1
  displayName: 'Build solution **\*.sln'
  inputs:
    solution: '**/*.sln'
    vsVersion: '17.0'
    msbuildArgs: '/p:BuildTasksDirectory="$(NugetsPath)\$(ToolsPackage)\DevAlm" /p:MetadataDirectory="$(MetadataPath)" /p:FrameworkDirectory="$(NuGetsPath)\$(ToolsPackage)" /p:ReferenceFolder="$(NuGetsPath)\$(PlatPackage)\ref\net40;$(NuGetsPath)\$(App1Package)\ref\net40;$(NuGetsPath)\$(App2Package)\ref\net40;$(NuGetsPath)\$(AppSuitePackage)\ref\net40;$(MetadataPath);$(Build.BinariesDirectory)" /p:ReferencePath="$(NuGetsPath)\$(ToolsPackage)" /p:OutputDirectory="$(Build.BinariesDirectory)"'

Stages #

Pipelines can also be broken into higher-level sections called stages.

A stage is a logical phase of your pipeline. For example, build and deploy would be two separate stages with different tasks each. Each stage can have its own jobs and steps, and you can control the order they run in.

Why is this useful for us? In Finance & Operations projects, we usually don’t just build X++! We also want to create a deployable package and deploy it to UAT and production.

Stages let you separate those concerns cleanly:

  • stages: is at the top level of the file, just like name, trigger, and pool.
  • Each stage has one or more jobs. Each job has steps.
  • We can make later stages depend on earlier stages with dependsOn, and even gate them with condition.
  • We can publish artifacts from one stage and consume them in the next. That’s how you go from “I built code” to “here’s a deployable package you can release”.

In short, with stages we get a mini lifecycle: build it, package it, deploy it, each phase isolated and controlled.

This is the structure you normally want for real F&O CI/CD.
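The stage layout described above can be sketched like this. This is a minimal skeleton, not a working F&O pipeline: the stage and job names are illustrative, and the script steps are placeholders for the real build and deploy tasks.

```yaml
stages:
  - stage: Build
    jobs:
      - job: BuildXpp
        steps:
          - script: echo "build X++ and create the deployable package here"
          # Publish the package so later stages can consume it
          - publish: $(Build.ArtifactStagingDirectory)
            artifact: drop

  - stage: DeployUAT
    dependsOn: Build         # runs only after Build
    condition: succeeded()   # and only if Build succeeded
    jobs:
      - job: Deploy
        steps:
          # Download the artifact published by the Build stage
          - download: current
            artifact: drop
          - script: echo "deploy the package to UAT here"
```

Note how the artifact published in Build is downloaded in DeployUAT: that hand-off is what turns "I built code" into "here's a deployable package you can release".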

© 2024 ariste.info. Designed by CazaPelusas.
