MSDyn365 & Azure DevOps ALM

Intro

X++ developers have been working without a version-control system for most of our careers. We had MorphX VCS for AX 2009 and the option to use TFVC in AX 2009 and AX 2012, but it wasn't mandatory. Actually, and speaking only from my experience, most projects used no VCS other than comments in the code. I'm not saying all of them, but in 10 years I've seen only one AX 2009 project using it.

If we told this to a non-X++ software developer, they would think we're nuts and probably foolish for not using a VCS. Who would risk losing their ongoing work due to a stupid mistake? Because we were exactly that: do you know someone who lost a whole day's work after pressing the wrong button? I'm sure you do. I do!

The AOT before source control, by cazapelusas.com

One of the major changes we got with Finance and Operations has been the mandatory use of a version-control system.

What follows here is the product of several blog posts written over the course of more than a year at ariste.info. I've rewritten the content to adapt it to the changes and new features we've gotten in Azure DevOps for Microsoft Dynamics 365 FnO, and tried to reorder it for easier reading.

I will also try to keep this document updated if something changes, but I can’t guarantee that. If you find an error, you can contact me at adria (at) ariste (dot) info.

I hope this guide helps all the developers out there. We need a strong developer community that can make correct use of these tools to make our Dynamics 365 projects successful.


Dynamics 365 for Finance & Operations and Azure DevOps

Azure DevOps

Azure DevOps will be the service we will use for source control. Microsoft Dynamics 365 for Finance and Operations supports TFVC out of the box as its version-control system.

But Azure DevOps does not only offer a source control tool. Of course, developers will be the ones who benefit most from it, but from project management to the functional team and customers, everybody can be involved in using Azure DevOps. BPM synchronization and task creation, team planning, source control, automated builds and releases are some of the tools it offers. All these changes will require some learning from the team, but in the short term all of this will help to better manage implementations.

As I said, it looks like the technical team is the one most affected by the addition of source control to Visual Studio, but it's also the one that benefits most from it…

First steps

To use all the features described in this guide we need to create an Azure DevOps project and connect it to LCS. This will be the first step and it’s mandatory so let’s see how we have to do everything.

Create an Azure DevOps organization

You might or might not have to do this. If you or your customer already have an account, you can use it and just create a new project in it. Otherwise head to https://dev.azure.com and create a new organization:

Azure DevOps sign up

After creating it you need to create a new project with the following options:

Create Azure DevOps project

Press the “Create project” button and you’re done. Now let’s connect this Azure DevOps project to our LCS project.

When a customer signs up for Finance and Operations, an LCS project of type "Implementation project" is created automatically. Your customers need to invite you to their project. If you're an ISV you can use the "Migrate, create solutions, and learn" projects.

In either case you need to go to "Project settings" and select the "Visual Studio Team Services" tab. Scroll down and you should see two fields. Fill in the first field with your Azure DevOps URL without the project part. If you have a URL of the https://dev.azure.com/YOUR_ORG type you need to change it to https://YOUR_ORG.visualstudio.com:

Azure DevOps setup on LCS

And to get the “Personal access token” we go back to our Azure DevOps project, click on the user settings icon, and then select “Personal access tokens”:

MSDyn365 & Azure DevOps ALM 1

We add a new token, set its expiration and give it full access. Finally press the “Create” button and a new dialog will appear with your token, copy it, and paste it in LCS.

Azure DevOps personal token

Back to LCS, once you’ve pasted the token press the “Continue” button. On the next step just select your project, press “Continue” and finally “Save” on the last step.

If you have any problem you can take a look at the docs where everything is really well documented.

The build server

Once we’ve linked LCS and Azure DevOps we’ll have to deploy the build server. This will be the heart of our CI/CD processes.

Even though the build virtual machine has the same topology as a developer box, it really isn't a developer VM and should never be used as one. It has Visual Studio installed, the AosService folder with all the standard packages, and SQL Server with an AxDB, just like any other developer machine, but that's not its purpose.

We won’t be using any of those features. The “heart” of the build machine is the build agent, an application which Azure DevOps uses to execute the build definition’s tasks from Azure DevOps.

We can also use Azure hosted build agents. Azure hosted agents allow us to run a build without a VM, the pipeline runs on Azure. This feature is currently in public preview.

The build VM

This VM is usually the dev box on Microsoft’s subscription but you can also use a regular cloud-hosted environment as a build VM.

When this VM is deployed two things happen: the basic source code structure and the default build definition are created.

MSDyn365 & Azure DevOps ALM 2
MSDyn365 & Azure DevOps ALM 3

Visual Studio

We have the basics to start working. Log into your dev VM and start Visual Studio; we must map the Main folder to the development machine's packages folder. Open the Team Explorer and select "Connect to a Project…":

MSDyn365 & Azure DevOps ALM 4

It will ask for your credentials and then show all projects available with the account you’ve used. Select the project we have created in the steps earlier and click on “Connect”:

MSDyn365 & Azure DevOps ALM 5

Now open the “Source Control Explorer”, select the Main folder and click on the “Not mapped” text:

MSDyn365 & Azure DevOps ALM 6

Map the Main folder to the K:\AosService\PackagesLocalDirectory folder on your service drive (this could be drive C if you’re using a local VM instead of a cloud-hosted environment):

MSDyn365 & Azure DevOps ALM 7

What we’ve done in this step is telling Visual Studio that what’s in our Azure DevOps project, inside the Main folder, will go into the K:\AosService\PackagesLocalDirectory folder of our development VM.

The Main folder we have in our source control tree is a regular folder, but we can convert it into a branch if we need it.

MSDyn365 & Azure DevOps ALM 8
MSDyn365 & Azure DevOps ALM 9

In the image above, you can see the icon for Main changes when it’s converted to a branch. Branches allow us to perform some actions that aren’t available to folders. Some differences can be seen in the context menu:

Folder context menu
Branch context menu

For instance, branches can display the hierarchy of all the project branches (in this case it’s only Main and Dev so it’s quite simple).

Branch hierarchy

Properties dialogs are different too. The folder one:

MSDyn365 & Azure DevOps ALM 10

And the branch one, where we can see the different relationships between the other branches created from Main:

Branch properties

This might not seem that interesting or useful, but one of the things converting a folder into a branch gives us is being able to see which branches a changeset has been merged into.

Some advice

I strongly recommend moving the Projects folder out of the Main branch (or whatever you call it) into the root of the project, at the same level as BuildProcessTemplates and Trunk. In fact, and this is my personal preference, I would keep anything that’s not code outside of a branch. By doing this you only need to take care of the code when merging and branching.

Those of us who have been working with AX for several years were used to not using version-control systems. MSDyn365FO has taken us to uncharted territory, so it is not uncommon for different teams to work in different ways, depending on their experience and what they've found along the way. Each team will need to invest some time to discover what works best for them regarding code, branching and methodologies. Many times this will be based on experimentation and trial and error, and with the pace of implementation projects trial and error can turn out badly.

Branching strategies

I want to make it clear in advance: I'm not an expert in managing code nor in Azure DevOps, at all. All that I've written here is the product of my experience (good and bad) of over 4 years working with Finance and Operations. In this article on branching strategies from the docs there's more information regarding branching and links to articles from the DevOps team. And there's even more info in the DevOps Rangers' Library of tooling and guidance solutions!

Main-Release

One possible strategy is using a Main and a Release branch. We have already learnt that the Main branch is created when the build VM is deployed. Usually, in an implementation project, all development is done on that branch until go-live, and just before that a new Release branch is created.

We will keep development work on the Main branch, and when that passes validation, we’ll move it to Release. This branching strategy is really simple and will keep us mostly worry-free.

Dev – Main – Release

This strategy is similar to the Main – Release one but includes a Dev branch for each developer. This dev branch must be maintained by the developer using it. They can do as many check-ins as they want during a development and, when it's done, merge all these changes into the Main branch as a single changeset. Of course, this adds some bureaucracy because we also need to forward integrate changes from Main into our Dev branch, but it will allow us to have a cleaner list of changesets when merging them from Main to the Release branch.

Whatever branching strategy you choose try to avoid having pending changesets to be merged for a long time. The amount of merge conflicts that will appear is directly proportional to the time the changeset has been waiting to be merged.

I wrote all of this based on my experience. Working for an ISV is obviously not the same as working for an implementation partner. An ISV has different needs: it must maintain different code versions to support all its customers, and it doesn't necessarily need to work in a Main – Release manner. It could have one (or more) branch for each version, although since the end of overlayering this is not strictly necessary. More ideas about this can be found in the article linked at the beginning.

Azure Pipelines

Builds

We’ve already seen that the default build definition has all the default steps active. We can disable (or remove) all the steps we’re not going to use. For example, the testing steps can be removed if we have no unit testing. We can also create new build definitions from scratch, however it’s easier to clone the default one and modify it to other branches or needs.

Since version 8.1, X++ hotfixes are gone; updates are applied as binaries in a single deployable package. This implies that the source-controlled Metadata folder will only contain our custom packages and models, with no standard packages anymore.

Continuous Integration

Continuous Integration (CI) is the process of automating the build and testing of code every time a team member commits changes to version control. (source)

Should your project/team use CI? Yes, yes, yes. This is one of the key features of having an automated build process.

This is what a build definition for CI that only compiles our codebase looks like:

Continuous build definition

Only the prepare and build steps. Then we need to go to the “Triggers” tab and enable the CI option:

DevOps continuous integration

Right after each developer check-in, a build will be queued, and the code compiled. In case there’s a compilation error we’ll be notified about it. Of course, we all build the solutions before checking them in and don’t need this CI build. Right?


We all know that "slow and steady wins the race", but at some point during a project that's not possible, and this kind of build definition can help us out. Especially when merging code between branches. It will allow us to be 100% sure, when creating a DP to release to production, that it'll build. I can tell you that having to do a release to prod in a hurry and seeing the Main build failing is not nice.

Gated check-ins

A gated check-in is a bit different from a CI build. A gated check-in triggers an automated build BEFORE checking in the code. If it fails, the changeset is not checked in until the errors are fixed and the code is submitted again.

This option might seem perfect for the merge check-ins to the Main branch. I’ve found some issues trying to use it, for example:

  • If multiple merges & check-ins from the same development are done and the first fails but the second doesn’t, you’ll still have pending merges to be done. You can try batching the builds, but I haven’t tried that.
  • Issues with error notifications and pending code on dev VMs.
  • If many check-ins are made, you'll end up with lots of queued builds (and we only have one available agent per DevOps project). This can also be solved using the "Batch changes while a build is in progress" option.

I think the CI option is working perfectly to validate code. As I’ve already said several times, choose the strategy that better suits your team and your needs. Experiment with CI and Gated check-in builds and decide what is better for you.

Set up the new Azure DevOps tasks for Packaging and Model Versioning

Almost all the tasks of the default build definition use PowerShell scripts that run on the build VM. We can replace 3 of those steps with newer tasks. In order to use these newer tasks, we need to install the "Dynamics 365 Unified Operations Tools" extension. We'll be using it to set up our release pipeline too, so consider doing it now.

Update Model Version task

This one is the easiest, just add it to your build definition under the current model versioning task, disable the original one and you’re done. If you have any filters in your current task, like excluding any model, you must add the filter in the Descriptor Search Pattern field using Azure DevOps pattern syntax.

Create Deployable Package task

This task will replace the Generate packages from the current build definitions. To set it up we just need to do a pair of changes to the default values:

X++ Tools Path
MSDyn365 & Azure DevOps ALM 11

This is your build VM's physical bin folder; the AosService folder is usually on drive K for cloud-hosted VMs. I guess this will change when we go VM-less to do the builds.

Update: the drive letter can be replaced with $(ServiceDrive), giving a path like $(ServiceDrive)\AOSService\PackagesLocalDirectory\bin.

Location of the X++ binaries to package
MSDyn365 & Azure DevOps ALM 12

The task comes with this field filled in as $(Build.BinariesDirectory) but this didn’t work out for our build definitions, maybe the variable isn’t set up on the proj file. After changing this to $(Agent.BuildDirectory)\Bin the package is generated.

Filename and path for the deployable package
MSDyn365 & Azure DevOps ALM 13

The path on the image should be changed to $(Build.ArtifactStagingDirectory)\Packages\AXDeployableRuntime_$(Build.BuildNumber).zip. You can leave it without the Packages folder in the path, but if you do that you will need to change the Path to Publish field in the Publish Artifact: Package step of the definition.

Add Licenses to Deployable Package task

This task will add the license files to an existing Deployable Package. Remember that the path of the deployable package must be the same as the one in the Create Deployable Package task.

Azure hosted build for Dynamics 365 Finance & SCM

The day we've been waiting for has come! The Azure hosted builds have been in public preview since PU35! We can now stop asking Joris when this will be available, because it already is! Check the docs!

I’ve been able to write this because, thanks to Antonio Gilabert, we’ve been testing this at Axazure for a few months with access to the private preview. And of course thanks to Joris for inviting us to the preview!

Azure hosted build
Riding the Azure Pipelines by Caza Pelusas

What does this mean? We no longer need a VM to run the build pipelines! Well, we still do: if you're running tests or synchronizing the DB as part of your build pipeline you still need the VM. But we can move CI builds to the Azure hosted agent!


Remember this is a public preview. If you want to join the preview you first need to be part of the Dynamics 365 Insider Program, where you can join the "Dynamics 365 for Finance and Operations Insider Community". Once invited you should see a new LCS project called PEAP Assets, and inside its Asset Library you'll find the nugets in the Nuget package section.

Azure agents

With the capacity to run an extra Azure hosted build we get another agent to run a pipeline, and can run multiple pipelines at the same time. But it still won’t be parallel pipelines, because we only get one VM-less agent. This means we can run a self-hosted and azure hosted pipeline at the same time, but we cannot run two of the same type in parallel. If we want that we need to purchase extra agents.

With a private Azure DevOps project we get 2GB of Artifacts space (we’ll see that later) and one self-hosted and one Microsoft hosted agent with 1800 free minutes:

Azure hosted build: Azure DevOps project pricing

We'll still keep the build VM, so it's difficult to tell a customer we need to pay extra money while still paying for that VM. Plus we've been doing everything with one agent until now and it's been fine, right? So take this as extra capacity: we can divide the builds between both agents and leave the MS hosted one for short builds to squeeze the 1800 free minutes as much as possible.

How does it work?

There’s really no magic in this. We move from a self-hosted agent in the build VM to a Microsoft-hosted agent.

The Azure hosted build relies on nuget packages to compile our X++ code. The contents of the PackagesLocalDirectory folder, the platform and the compiler tools have basically been packed into nugets; what we have on the build VM is now in 3 nugets.

When the build runs it downloads and installs the nugets and uses them, along with the standard packages, to compile our code on the Azure hosted agent.

What do I need?

To configure the Azure hosted build we need:

  • The 3 nuget packages from LCS: Compiler tools, Platform X++ and Application X++.
  • A user with rights at the organization level to upload the nugets to Azure DevOps.
  • Some patience to get everything running 🙂

So the first step is going to the PEAP LCS’ Asset Library and downloading the 3 nuget packages:

Nugets for the Azure Hosted Build

Azure DevOps artifact

All of this can be done on your PC or on a dev VM, but since you'll need to add some files and a VS project to your source control, it's easier to do it from the developer box.

Head to your Azure DevOps project and go to the Artifacts section. Here we’ll create a new feed and give it a name:

Azure DevOps artifact feed

You get 2GB for artifacts, and the 3 nuget packages take around 500MB, so you should have no issues with space unless you have other artifacts in your project.

Now press the “Connect to feed” button and select nuget.exe. You’ll find the instructions to continue there but I’ll explain it anyway.

Then you need to download nuget.exe and put it in the Windows PATH. You can also get the nugets and nuget.exe in the same folder and forget about the PATH. Up to you. Finally install the credential provider: download this Powershell script and run it.

Create a new file called nuget.config in the same folder where you’ve downloaded the nugets. It will have the content you can see in the “Connect to feed” page, something like this:
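
As a reference, here is a minimal sketch of what that file usually looks like; the organization and feed names below are placeholders, so copy the exact content shown for your own feed:

    <?xml version="1.0" encoding="utf-8"?>
    <configuration>
      <packageSources>
        <clear />
        <!-- YOUR_ORG and AASBuild are placeholders: use the exact URL from your "Connect to feed" page -->
        <add key="AASBuild" value="https://pkgs.dev.azure.com/YOUR_ORG/_packaging/AASBuild/nuget/v3/index.json" />
      </packageSources>
    </configuration>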

This file’s content has to be exactly the same as what’s displayed in your “Connect to feed” page.

And finally, we’ll push (upload) the nugets to our artifacts feed. We have to do this for each one of the 3 nugets we’ve downloaded:
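A sketch of the push commands, assuming the feed is called AASBuild; the .nupkg file names are illustrative, use the exact names (and versions) of the files you downloaded:

    # Run from the folder containing nuget.exe, nuget.config and the downloaded packages.
    # The -ApiKey value is ignored by Azure Artifacts but nuget.exe requires one.
    nuget.exe push -Source "AASBuild" -ApiKey az .\Microsoft.Dynamics.AX.Platform.CompilerPackage.<version>.nupkg
    nuget.exe push -Source "AASBuild" -ApiKey az .\Microsoft.Dynamics.AX.Platform.DevALM.BuildXpp.<version>.nupkg
    nuget.exe push -Source "AASBuild" -ApiKey az .\Microsoft.Dynamics.AX.Application.DevALM.BuildXpp.<version>.nupkg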

You’ll get prompted for the user. Remember it needs to have enough rights on the project.

Of course, you need to change "AASBuild" to your artifact feed name. And we're done with the artifacts.

Prepare Azure DevOps

This new agent needs a solution to build our packages. This means we have to create an empty solution in Visual Studio and set the package of the project to our main package. Like this:

Visual Studio solution

If you have more than one package or model, you need to add a project to this solution for each separate model you have.

We have to create another file called packages.config with the following content:
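A sketch of the file; the package ids below are the usual ones, but check the names of the nugets you downloaded, and replace the versions with the ones from LCS:

    <?xml version="1.0" encoding="utf-8"?>
    <packages>
      <!-- Replace the versions with the versions of the nugets you downloaded from LCS -->
      <package id="Microsoft.Dynamics.AX.Platform.CompilerPackage" version="REPLACE_WITH_NUGET_VERSION" />
      <package id="Microsoft.Dynamics.AX.Platform.DevALM.BuildXpp" version="REPLACE_WITH_NUGET_VERSION" />
      <package id="Microsoft.Dynamics.AX.Application.DevALM.BuildXpp" version="REPLACE_WITH_NUGET_VERSION" />
    </packages>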

The version tag depends on the version of the nugets you've downloaded from LCS (PU35 at the time of writing). We'll need to update this file each time a new version of the nugets is published.

And, to end with this part, we need to add the solution, the nuget.config and the packages.config files to TFVC. This is what I’ve done:

Azure DevOps

You can see I've created a Build folder in the root of my DevOps project. That's only my preference, but I like to have only code in my branches; even the projects are outside of the branches, because I only want code to move in merges between branches. Place the files and the solution inside the Build folder (or wherever you decide).

Configure pipeline

Now we need to create a new pipeline, you can just import this template from the newly created X++ (Dynamics 365) Samples and Tools Github project. After importing the template we’ll modify it a bit. Initially, it will look like this:

Azure hosted build: Default imported pipeline

As you can see the pipeline has all the steps needed to generate the DP, but some of them, the ones contained in the Dynamics 365 tasks, won’t load correctly after the import. You just need to add those steps to your pipeline manually and complete its setup.

Pipeline root


You need to select the Hosted Azure Pipelines for the Agent pool, and vs2017-win2016 as Agent Specification.

Get sources

DevOps mappings
Azure hosted build: Our mappings

I’ve mapped 2 things here: our codebase in the first mapping and the Build folder where I’ve added the solution and config files. If you’ve placed these files inside your Metadata folder you don’t need the extra mapping.

NuGet install Packages

This step gets the nugets from our artifacts feed and installs them so they can be used in each pipeline execution.

Azure hosted build: nuget install

The command uses the config files we have uploaded to the Build folder, and as you can see it's fetching the files from the $(build.sourcesDirectory)\Build directory we've configured in the Get sources step. If you've placed those files in a different place you need to change the paths as needed.

Update Model Version

This is one of the steps that are displaying issues even though I got the Dynamics 365 tools installed from the Azure DevOps marketplace. If you got it right you probably don’t need to change anything. If you have the same issue as me, just add a new step and select the “Update Model Version” task and change the fields so it looks like this:

Update Model Version
Azure hosted build: Update Model Version

Create Deployable Package

This is another one of the steps that are not loading correctly for me. Again, add it and change as needed:

Azure hosted build: Create Deployable Package

Add Licenses to Deployable Package

Another step with issues. Do the same as with the others:

Azure hosted build: Add Licenses to Deployable Package

And that’s all. You can queue the build to test if it’s working. For the first runs you can disable the steps after the “Build solution” one to see if the nugets are downloaded correctly and your code built. After that try generating the DP and publishing the artifact.

You've configured your Azure hosted build; now it's your turn to decide in which cases you'll use the self-hosted or the Azure-hosted agent.

Setup Release Pipelines

We’ve seen how the default build definition is created and how we can modify it. Now we’ll see how to configure our release pipelines!

The release pipelines allow us to automatically deploy our deployable packages to a Tier 2+ environment. This is part of the Continuous Delivery (CD) strategy. We can only do this for UAT environments; it's not possible to automate the deployment to the production environment.

Setting up Release Pipeline in Azure DevOps for Dynamics 365 for Finance and Operations

To configure the release pipeline, we need:

  • AAD app registration
  • LCS project
  • An Azure DevOps project linked to the LCS project above
  • A service account

I recommend using a service account for this, with a non-expiring password and no MFA enabled. It must have enough privileges on LCS, Azure and Azure DevOps. This is not mandatory, and it can be done even with your own user (if it has enough rights) for testing purposes, but if you're setting this up for real don't use your own user; go for a service account.

AAD app creation

The first step is creating an app registration on Azure Active Directory that will be used to upload the generated deployable package to LCS. Head to the Azure portal and, once logged in, go to Azure Active Directory, then App registrations, and create a new Native app:

MSDyn365 & Azure DevOps ALM 14

Next go to “Settings” and “Required permissions” to add the Dynamics Lifecycle Services API:

MSDyn365 & Azure DevOps ALM 15

In the dialog that will open change to the “APIs my organization uses” tab and select “Dynamics Lifecycle Services”:

MSDyn365 & Azure DevOps ALM 16

Select the only available permission in the next screen and click the "Add permissions" button. Finally press the "Grant admin consent" button to apply the changes. This last step is easy to forget, and the package upload to LCS can't be done if consent isn't granted. Once done, take note of the Application ID, we'll use it later.

Create the release pipeline in DevOps

Go to Azure DevOps, and to Pipelines -> Releases to create the new release. Select “New release pipeline” and choose “Empty job” from the list.

On the artifact box select the build which will be used for this release definition:

New release

Pick the build definition you want to use for the release in “Source”, “Latest” in “Default version” and push “Add”.

The next step is adding a task with the release pipeline for Dynamics. Go to the Tasks tab and press the plus button. A list of extensions will appear; look for "Dynamics 365 Unified Operations Tools":

Dynamics 365 Unified Operations Tools

If the extension hasn’t been added previously it can be done in this screen. In order to add it, the user used to create the release must have admin rights on the Azure DevOps account, not only in the project in which we’re creating the pipeline.

When the task is created we need to fill in some parameters:

Release Dynamics Operations

Creating the LCS connection

The first step in the task is setting up the link to LCS using the AAD app we created before. Press New and let’s fill the fields in the following screen:

MSDyn365 & Azure DevOps ALM 17

It's only necessary to fill in the connection name, username and password (of the service account), and the Application (Client) ID fields. Use the App ID we got in the first step for the App ID field. The endpoint fields should be automatically filled in. Finally, press OK and the LCS connection is ready.

In the LCS Project Id field, use the ID from the LCS project URL; for example in https://lcs.dynamics.com/V2/ProjectOverview/1234567 the project id is 1234567.

Press the button next to “File to upload” and select the deployable package file generated by the build:

Generated DP

If the build definition hasn’t been modified, the output DP will have a name like AXDeployableRuntime_VERSION_BUILDNUMBER.zip. Change the fixed Build Number for the DevOps variable $(Build.BuildNumber) like in the image below:

BuildNumber

The package name and description in LCS are defined in "LCS Asset Name" and "LCS Asset Description". Azure DevOps build and release variables can be used in these fields. Use whatever fits your project; for example, a prefix to distinguish between prod and pre-prod packages followed by $(Build.BuildNumber) will upload the DP to LCS with a name like Prod 2019.1.29.1, using the date as part of the DP name.

Save the task and release definition and let’s test it. In the Releases select the one we have just created and press the “Create a release” button, in the dialog just press OK. The release will start and, if everything is OK we’ll see the DP in LCS when it finishes:

LCS Asset Library

The release part can be automated, just press the lightning button on the artifact and enable the trigger:

Release trigger

And that’s all! Now the build and the releases are both configured. Once the deployment package is published the CI scenario will be complete.

More automation!

I've already explained how to automate the builds, create the CI builds and create the release pipelines on Azure DevOps. What I want to talk about now is adding a little bit more automation.

Builds

In the build definition go to the “Triggers” tab and enable a scheduled build:

MSDyn365 & Azure DevOps ALM 18

This will automatically trigger the build at the time and days you select. In the example image, a new build will be launched every weekday at 16.30. But every day? Nope! What the "Only schedule builds if the source or pipeline has changed" checkbox below the time selector does is trigger the build only if there's been a change to the codebase, meaning that if no changeset is checked in during that day, no build will be triggered.

Releases

First step done, let's see what we can do with the releases:

MSDyn365 & Azure DevOps ALM 19

The release pipeline in the image above is the one that launches after the build I’ve created in the first step. For this pipeline I’ve added the following:

MSDyn365 & Azure DevOps ALM 20

The continuous deployment trigger has been enabled, meaning that after the build finishes this release will be automatically run. No need to define a schedule but you could also do that.

MSDyn365 & Azure DevOps ALM 21

As you can see, the schedule screen is exactly the same as in the builds, even the changed-pipeline checkbox is there. You can use either of these two approaches, CD or scheduled release; it's up to your project or team needs.

With these two small steps you can have your full CI and CD strategy automated and update a UAT environment each night, having all the changes made during that day ready for testing, with no human interaction!

But I like to add some human touch to it

If you don’t like not knowing if an environment is being updated… well that’s IMPOSSIBLE because LCS will SPAM you to make sure you know what’s going on. But if you don’t want to be completely replaced by robots you can add approvals to your release flow:

MSDyn365 & Azure DevOps ALM 22

Clicking the left lightning + person button on your release, you can set the approvers, a person or a group (which is quite practical), the kind of approval (all or a single approver) and the timeout. The approvers will also receive an email with a link to the approval form:

MSDyn365 & Azure DevOps ALM 23

And you can also postpone the deployment! Everything is awesome!

Extra bonus!

A little tip. Imagine you have the following release:

MSDyn365 & Azure DevOps ALM 24

This will update 3 environments, but it will also upload the same deployable package three times to LCS. Wouldn't it be nice to have a single upload and have all the deployments use that file? Unfortunately, we can't pass the output variable from the upload stage to the other stages 🙁. But we can do something about it with a little help from our friend PowerShell!

Update a variable in a release

What we need to do is create a variable in the release definition and set its scope to “Release”:

MSDyn365 & Azure DevOps ALM 25

Then, for each stage, we need to enable this checkbox in the agent job:

MSDyn365 & Azure DevOps ALM 26

I'll explain later why we're enabling this. Now we only need to update this variable after uploading the DP to LCS. Add an inline PowerShell step after the upload one and do this:
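
A minimal sketch of that inline script, based on the approach from Stefan Stranger's post mentioned below; the REST API version and the predefined variables used here are assumptions you may need to adjust:

    # Sketch: update a release-scoped variable through the Azure DevOps REST API
    $assetId = "$(GoldenUpload.FileAssetId)"      # output variable of the LCS upload task
    $ReleaseVariableName = 'axzfileid'            # release-scoped variable to update

    # The agent's OAuth token is available because we enabled the checkbox above
    $headers = @{ Authorization = "Bearer $env:SYSTEM_ACCESSTOKEN" }
    $url = "$($env:SYSTEM_TEAMFOUNDATIONCOLLECTIONURI)$($env:SYSTEM_TEAMPROJECTID)/_apis/release/releases/$($env:RELEASE_RELEASEID)?api-version=5.1"

    # Get the running release, update the variable's value and send it back
    $release = Invoke-RestMethod -Uri $url -Method Get -Headers $headers
    $release.variables.$ReleaseVariableName.value = $assetId
    Invoke-RestMethod -Uri $url -Method Put -Headers $headers -ContentType 'application/json' -Body ($release | ConvertTo-Json -Depth 100) | Out-Null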

You need to change the following:

  • $assetId = "$(GoldenUpload.FileAssetId)": change $(GoldenUpload.FileAssetId) to your upload task's output variable name.
  • $ReleaseVariableName = 'axzfileid': change axzfileid to your release variable name.

And you're done. This script uses Azure DevOps' REST API to update the variable value with the file id, and we enabled the OAuth token checkbox to allow the usage of this API without having to pass any user credentials. This is not my idea, obviously; I've done it thanks to this post from Stefan Stranger's blog.

Now, in the deploy stages you need to retrieve your variable’s value in the following way:

MSDyn365 & Azure DevOps ALM 27

Don’t forget the ( ) or it won’t work!

And with these small changes you can have a release like this:

MSDyn365 & Azure DevOps ALM 28

With a single DP upload to LCS and multiple deployments using the file uploaded in the first stage. With approvals, and delays, and emails, and everything!

And now the bad news

The bad news is that, right now, we can't automate the deployments to self-service environments. Nor can we do this on a production environment, where we must deploy manually.

LCS DB API

Call the LCS Database Movement API from your Azure DevOps Pipelines

What for?

Basically, automation. Right now the API only allows the refresh from one Microsoft Dynamics 365 for Finance and Operations environment to another, so the idea is having fresh data from production in our UAT environments daily. I don’t know which new operations the API will support in the future but another idea could be adding the DB export operation (creating a bacpac) to the pipeline and having a copy of prod ready to be restored in a Dev environment.

Don’t forget that the API has a limit of 3 refresh operations per environment per 24 hours. Don’t do this on a CI build! (it makes no sense either). Probably the best idea is to run this nightly with all your tests, once a day.

Calling the API

I'll use PowerShell to call the API from a pipeline. PowerShell has a command called Invoke-RestMethod that makes HTTP/HTTPS requests. It's really easy, and we just need to do the same thing we did to call the API in my earlier post.

Getting the token

To get the token we'll use this script. Just change the variables to those of your project, AAD app registration, user (remember it needs access to the preview) and password, and run it. If everything is OK you'll get the JSON response in the $tokenResponse variable, and from there you can get the token's value using dot notation.
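
A sketch of such a script; all the values below are placeholders:

    # Request an AAD token for the LCS API using the password grant
    $clientId = "00000000-0000-0000-0000-000000000000"       # AAD app registration (client) id
    $username = "service.account@yourtenant.com"
    $password = "yourPassword"

    $body = @{
        grant_type = "password"
        client_id  = $clientId
        username   = $username
        password   = $password
        resource   = "https://lcsapi.lcs.dynamics.com"
    }

    $tokenResponse = Invoke-RestMethod -Method Post -Uri "https://login.microsoftonline.com/common/oauth2/token" -Body $body
    $token = $tokenResponse.access_token       # dot notation to get the token's value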

Requesting the DB refresh

This will be the call to trigger the refresh. We’ll need the token we’ve just obtained in the first step to use it in the header and the source and target environment Ids.
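
A sketch of the call; the project and environment ids are placeholders, and the endpoint is the database movement API's refresh operation (check the docs for the exact route):

    $projectId   = "1234567"
    $sourceEnvId = "00000000-0000-0000-0000-000000000000"    # e.g. production environment id
    $targetEnvId = "11111111-1111-1111-1111-111111111111"    # e.g. UAT environment id

    $refreshUrl = "https://lcsapi.lcs.dynamics.com/databasemovement/v1/refresh/project/$projectId/source/$sourceEnvId/target/$targetEnvId"

    # Use the token from the previous step in the Authorization header
    $header = @{ Authorization = "Bearer $token" }

    $refreshResponse = Invoke-RestMethod -Method Post -Uri $refreshUrl -Headers $header
    $refreshResponse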

If it’s successful the response will be a 200 OK.

Add it to your pipeline

Adding this to an Azure DevOps pipeline is no mystery. Select and edit your pipeline, I'm doing it on a nightly build (it's called continuous but it's not…) that runs after the environment has been updated with code, and add a new PowerShell task:

MSDyn365 & Azure DevOps ALM 29

Select the task and change it to “Inline”:

MSDyn365 & Azure DevOps ALM 30

Then just paste the script we’ve created in the Script field and done! You’ll get a refresh after the tests!

You can also run this on your release pipeline BUT if you do it after the deploy step remember to mark the “Wait for Completion” option or the operation will fail because the environment will already be servicing! And even then it could fail if the servicing goes over the timeout time. So… don’t run this on your release pipeline!

And that's all. Let's see which new operations will be added to the API and what we can do with them.

Use d365fo.tools in your Azure Pipeline

Thanks to Mötz’s comment pointing me to how to add d365fo.tools to a hosted pipeline I’ve created a pipeline which will install the tools and run the commands. It’s even easier to do than with the Invoke-RestMethod.

But first…

Make sure that in your Azure Active Directory app registration you’ve selected “Treat application as a public client” under Authentication:

MSDyn365 & Azure DevOps ALM 31

The task

First we need to install d365fo.tools and then we can use its commands to call the LCS API:
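
A sketch of that inline task; the ids, user and password are placeholders (and should really come from secret variables):

    # Install d365fo.tools on the hosted agent
    Install-Module -Name d365fo.tools -Force -Scope CurrentUser

    # Get a token for the LCS API and store it (plus project and client ids) in the d365fo.tools config
    Get-D365LcsApiToken -ClientId "00000000-0000-0000-0000-000000000000" `
                        -Username "service.account@yourtenant.com" `
                        -Password "yourPassword" `
                        -LcsApiUri "https://lcsapi.lcs.dynamics.com" |
        Set-D365LcsApiConfig -ProjectId 1234567

    # Trigger the refresh using only the environment ids
    Invoke-D365LcsDatabaseRefresh -SourceEnvironmentId "00000000-0000-0000-0000-000000000000" `
                                  -TargetEnvironmentId "11111111-1111-1111-1111-111111111111"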

As you can see, it's a bit easier to do the refresh using d365fo.tools. We get the token and pipe the output to the Set-D365LcsApiConfig command, which will store the token (and other values). This also saves us from having to duplicate AppIds, users, etc., and as you can see, to call the refresh operation we just need the source and target environment Ids!

Automating Prod to Dev DB copies

The new LCS DB API endpoint to create a database export has been published! With it we now have a way of automating and scheduling a database refresh from your Dynamics 365 FnO production environment to a developer or Tier 1 VM.

Using the LCS DB API

The bacpac issue

One of the main setbacks we currently have with prod DB refreshes is that it’s not a quick thing to do because you need to:

  • Refresh a Tier 2+ environment with prod’s DB
  • Export a bacpac from the Tier 2+ environment
  • Restore the bacpac on a Tier 1 VM.

This happens because Tier 2+ environments use Azure SQL as the DB engine and Tier 1 VMs use SQL Server.

The time it takes to complete the process depends on the size of the database and the performance of the VM you’ll restore it to. But it’s not a fast process at all. For a 60GB database you’ll get a bacpac around 7GB that will take:

  • 1 to 2 hours to refresh to UAT
  • 2 to 4 hours for the bacpac to be exported
  • At least 4 hours to restore it to a Tier 1 VM.

That’s between 7 and 11 hours until you have the DB on a developer machine. Once it’s there you can quickly get a BAK and share it. But you might need the time of a full working day to have that data available.

Save us LCS DB API!

Thanks to the new LCS DB API’s endpoint we can perform all these steps automatically, and with the help of d365fo.tools it’ll be even easier. But first…

Due to the long time it takes to complete the whole process, we first have to decide a schedule (daily, weekly, etc.), and this schedule must be compatible with the release cadence to UAT/Prod, because only one operation at a time can be done.

There’s still another problem but I’ll talk about it after seeing the scripts.

My proposal

To do the last part of the LCS DB API flow from prod to dev, we need a Tier 1 VM where the bacpac will be restored. My idea is using the build VM on Microsoft’s subscription and an Azure DevOps pipeline to run all the scripts that will restore the DB in that VM. It’s an underused machine and it fits perfectly to this purpose.

I want to clarify why I've thought about doing this using the build VM. In most cases this VM will be doing nothing during the night, maybe only running some tests, and it's during that period of time that I suggest doing all this. But be aware that, depending on your DB size, this might not be possible, or you might run out of space after 2 or 3 restores.

So think about deploying an extra VM and installing an agent there to do this; whatever you do, don't mess with the build VM if you don't know what you're doing! Try this on a dev VM or anywhere else if you're afraid of breaking something. Remember you'll lose the capacity to generate DPs and run pipelines if this environment breaks!

This is just an example of a possible solution; you need to decide what suits you best!

As I said before I’ll be using Mötz Jensen‘s d365fo.tools, we could do everything without them but that would be a bit stupid because using the tools is easier, faster and makes everything clearer.

I've separated all the steps into 3 PowerShell scripts: execute the refresh, export the bacpac and restore the bacpac.

Refresh database

This will refresh the prod environment to a Tier 2+ environment:
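
A sketch of how that script could look with d365fo.tools; every id and credential below is a placeholder:

    $clientId  = "00000000-0000-0000-0000-000000000000"    # AAD app id
    $username  = "service.account@yourtenant.com"
    $password  = "yourPassword"
    $projectId = 1234567
    $prodEnvId = "00000000-0000-0000-0000-000000000000"
    $uatEnvId  = "11111111-1111-1111-1111-111111111111"

    # Get a token and store the LCS API config for the following calls
    Get-D365LcsApiToken -ClientId $clientId -Username $username -Password $password -LcsApiUri "https://lcsapi.lcs.dynamics.com" |
        Set-D365LcsApiConfig -ProjectId $projectId

    # Refresh prod into the Tier 2+ (UAT) environment
    Invoke-D365LcsDatabaseRefresh -SourceEnvironmentId $prodEnvId -TargetEnvironmentId $uatEnvId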

Export database

This part will trigger the bacpac export from the Tier 2+ environment which we’ve just refreshed:
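
A sketch, assuming the LCS API config was stored with Set-D365LcsApiConfig in the previous script:

    $uatEnvId = "11111111-1111-1111-1111-111111111111"

    # Export the Tier 2+ database to a bacpac in the LCS asset library
    Invoke-D365LcsDatabaseExport -SourceEnvironmentId $uatEnvId -BackupName "ProdCopy_$(Get-Date -Format yyyyMMdd)"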

Restore bacpac

And the final step will download the bacpac and restore it to a new database:
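
A sketch of this last script; it assumes d365fo.tools is installed on the VM, that the backup list exposes the download link in a FileLocation property, and the paths and database name are just examples:

    $bacpacPath = "D:\Backup\ProdCopy.bacpac"

    # Get the latest exported bacpac from the LCS asset library and download it
    $backup = Get-D365LcsDatabaseBackups -Latest
    Invoke-D365AzCopyTransfer -SourceUri $backup.FileLocation -DestinationUri $bacpacPath

    # Restore the bacpac as a new database on the Tier 1 VM
    Import-D365Bacpac -ImportModeTier1 -BacpacFile $bacpacPath -NewDatabaseName "AxDB_ProdCopy"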

Using it in an Azure DevOps pipeline

Azure DevOps pipeline

This is it. Create a Powershell script, place it in the Build VM and call it in your pipeline. This is only valid for the agent hosted in the build VM. Everything can probably be run in an Azure hosted agent, but I’ll not cover it here because I think that using the build VM, where we can restore the DB, is more useful to us.

Timing

These 3 scripts will call the LCS DB API to refresh, export and restore the DB. But there’s the timing issue.

Refreshing the database takes some time and exporting it too. You need to find a way to control the status of the operations. The LCS DB API offers an operation you can use to get the status of the ongoing operation. Using d365fo.tools:
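
A short sketch of that status check; the activity id comes from the object returned by the refresh or export call, and the environment id is the one the operation runs on:

    # Query the status of an ongoing LCS database operation
    Get-D365LcsDatabaseOperationStatus -OperationActivityId $refresh.OperationActivityId -EnvironmentId $uatEnvId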

You can choose to control that inside your Powershell scripts, but if we use the agent on the build VM that means we cannot use it for anything else until everything is done.

That’s why I separated the process in 3 steps. You can manually schedule 3 pipelines, one for each step at the times you know each stage ends. Then you can choose the order: export, restore, refresh or refresh, export, restore.

You could also use Windows Task Scheduler and forget about AZDO Pipelines, but we’re not doing that because we love pipelines.

And that's all. We finally have a way of moving data without having to do it manually, and we can schedule it, but we need to make some decisions on how we'll do things. And I'll leave that up to you 🙂

Secure your Azure Pipelines with Azure Key Vault

But creating a pipeline with a password in plain sight was not very secure. How could we add extra security to a pipeline? Once again we can turn to an Azure tool to help us, the Azure Key Vault.

Azure Key Vault

A Key Vault is a service that allows us to safely store certificates or secrets and later use them in our applications and services. Like many other Azure services it has a cost, but it's really low; for normal use you'll be billed a cent or less a month. Don't be stingy with security!

You might already know about Azure Key Vault because we can use it in Microsoft Dynamics 365 for Finance and Operations under System Administration. For example it’s how the company certificates for the Spanish SII or Brazilian NF-e are stored and later retrieved to call the web services.

Securing your Azure DevOps Pipelines

Thanks to the Azure Key Vault task (which is open source like many other tasks) getting a secret from a Key Vault has no secret (badum tssss).

Create a Key Vault

Go to your Azure subscription and look for Key Vaults in the top search bar. If you don’t have an Azure subscription you can get one free with a credit of 170€/200$ for 30 days and try this or other things.

In the Key Vault page click on "Create key vault" and fill in the fields:

MSDyn365 & Azure DevOps ALM 32

You can go through other tabs but I will just click “Review & Create” to create the vault.

Add the task to DevOps

Now go to Azure DevOps and create a new pipeline or edit an existing one. Add a task to the agent job and look for azure key vault:

MSDyn365 & Azure DevOps ALM 33

It's possible that you might need to get the task from the marketplace first; if so, remember you need to have enough rights on the organization, not only on the AZDO project you're in. Now go to the task and select your subscription:

MSDyn365 & Azure DevOps ALM 34

Once selected click the “Authorize” button. This will create a service principal in your subscription, we’ll use it later. After authorizing you just need to select the key vault you’ve created in the first step. And back to Azure.

Setup and secret creation

Go to your key vault, “Access policies” and click “Add Access Policy”:

MSDyn365 & Azure DevOps ALM 35

When we authorized the task to access our Azure subscription, it created a service principal. Now we need to select it and allow it to list and get the secrets so we can use them in our pipeline. Click on "Select principal":

MSDyn365 & Azure DevOps ALM 36

In the search bar type your subscription’s name, the principal should start with it and end with the same ID of your subscription. Select it and click the “Select” button at the bottom:

MSDyn365 & Azure DevOps ALM 37

Now click on the “Secret permissions” lookup and under “Secret Management Operations” select Get and List:

MSDyn365 & Azure DevOps ALM 38

If you want to also use certificates or keys you should do the same. Finally click the “Add” button and don’t forget to click “Save”!! Otherwise nothing will be saved:

MSDyn365 & Azure DevOps ALM 39

Now we can create a secret in the key vault. Go to secrets and click on “Generate/Import”, complete the fields and finally click on the “Create” button:

MSDyn365 & Azure DevOps ALM 40

Using the secrets in your pipelines

We’re ready to use the secret in our pipeline. I will add a PowerShell task to call the LCS DB API using d365fo.tools but I’ll change all the variables to the secrets:
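
A sketch of that task's script; the $( ) secret names are examples, use the names you gave your secrets in the Key Vault:

    # The Azure Key Vault task earlier in the pipeline exposes the secrets as pipeline variables
    Install-Module -Name d365fo.tools -Force -Scope CurrentUser

    Get-D365LcsApiToken -ClientId "$(aad-app-id)" -Username "$(lcs-user)" -Password "$(lcs-password)" `
                        -LcsApiUri "https://lcsapi.lcs.dynamics.com" |
        Set-D365LcsApiConfig -ProjectId "$(lcs-project-id)"

    Invoke-D365LcsDatabaseRefresh -SourceEnvironmentId "$(prod-env-id)" -TargetEnvironmentId "$(uat-env-id)"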

As you can see now even the AAD App Id is masked.

What the Azure Key Vault task does is get the secrets from Azure and store them in variables when the pipeline runs:

MSDyn365 & Azure DevOps ALM 41

Then we can access their values with the $(variableName) notation in the PowerShell script. If you try to print the secrets' values using the Write-Host command, all you'll get is three asterisks, so you can see that using the Key Vault is more than safe. If we check the result of running the Get-D365LcsDatabaseBackups command we'll see how good this is:

MSDyn365 & Azure DevOps ALM 42

The ProjectId value is not printed because it was one of our secret values!

And this is how you can add extra security to your Dev ALM!