You can read my complete ALM guide on Microsoft Dynamics 365 for Finance & Operations and Azure DevOps.

Moving data from production to a sandbox environment is something we regularly have to do to get real, up-to-date data for testing or debugging. It’s a process that takes time, but it can be automated, as I explained in the post LCS DB API: automating Prod to Dev DB copies.

Me pressing the auto button to automagically import a data package

In this post, I’ll add an extra step to the database refresh: restore a data package (DP). Why? Because I’m sure we all need to change some parametrization or some endpoints in our test environments after a prod refresh.

You can learn more about the DMF REST API, which I’ll use, reading this post from Fabio Filardi: Dynamics 365 FinOps: Batch import automation with Azure Functions, Business Events and PowerBI.

You can learn more about the LCS DB REST API by reading the posts I wrote some time ago. You might want to read them because I’m skipping some steps which are already explained there.

How will we do it?

The idea is the following:

  • We’re using the LCS DB API to refresh a sandbox environment with data from production.
  • We’ve previously exported the data package we will restore.
  • We need an Azure blob storage account to host the data package we’ve exported.
  • Finally, we will use the Data management package REST API to get the Dynamics 365 blob URL, and OData actions to import the data package.

I’ll be skipping the DB restore part because I already explained how to do it in that post, which is the base for this one.

The flow will be:

  1. Get a SAS URL for our saved data package from our Azure blob.
  2. Get the SAS URL for the blob of our Dynamics 365 target environment using the GetAzureWriteUrl OData action.
  3. Copy our data package from the source blob to the target blob.
  4. Execute the import with the ImportFromPackage action.

Create an Export project

This is the first thing we need to do in our source environment where we want to get our data from. We have to create an export project that contains all the entities that we will restore in our target environment. Head to the Data management workspace and then press the Export tile:

Data management workspace: Export

Make sure you check the Generate data package checkbox in the export project and then add all your entities that will be uploaded and imported later:

DMF Export project with Generate data package enabled

Run the job and, when it’s done, download the data package.

Create an Azure Blob and upload the Data package

Create an Azure storage account, then a blob container inside it, and upload the data package we’ve exported there.

Create an Import project

The next step will be creating an import project in our target environment to import our DP and also reference it in our pipeline. Select the DP we’ve exported in the first step and add it:

Import project: add the data package

Azure DevOps pipeline

And the last step is to create a new pipeline or add additional steps to the pipeline that will run your DB refresh.

Update: let’s forget about uploading azcopy.exe to our repo; we’ll be installing it during the pipeline run instead.

What I’ve done to start is upload azcopy.exe to my code repository:

Added Azcopy.exe to TFVC

Why? Because I’ve tried using the Blob service REST API and couldn’t manage to copy from the source blob to the target blob, so I chose the easy way (for me).

Then, my pipeline consists of two tasks:


In the Get sources section I’ve just mapped the Tools folder, like this:

Tools folder mapped as the $(build.sourcesDirectory) root

And regarding the tasks, both are PowerShell tasks. In the first one, I just install the Azure PowerShell module and then install azcopy in the c:\temp folder. I could do everything in one task; this is only a matter of organization. This first task has this script:
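A sketch of that first task, assuming a Windows Microsoft-hosted agent (the Az module name and the azcopy download link are the publicly documented ones, so check them against your agent image), could be:

```powershell
# Install the Azure PowerShell module for the current user
Install-Module -Name Az -Scope CurrentUser -Force -AllowClobber

# Download azcopy and extract it to c:\temp
New-Item -Path 'c:\temp' -ItemType Directory -Force | Out-Null
Invoke-WebRequest -Uri 'https://aka.ms/downloadazcopy-v10-windows' -OutFile 'c:\temp\azcopy.zip'
Expand-Archive -Path 'c:\temp\azcopy.zip' -DestinationPath 'c:\temp' -Force
```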

So each time the Microsoft-hosted pipeline runs, it will install the Azure PowerShell module.

And the second step is where everything else happens. Let’s go through its script step by step.

Step by step

The first block corresponds to the settings of your blob storage account and Dynamics 365 Finance and Operations environment:
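It could look something like this sketch, where every value is a placeholder for your own storage account, AAD app registration and environment details:

```powershell
# Azure blob storage account hosting the exported data package
$storageAccountName = 'mystorageaccount'
$storageContainer   = 'datapackages'
$storageAccountKey  = 'xxxxxxxx'          # better: a secret pipeline variable
$packageName        = 'ParametersDP.zip'  # the data package we uploaded

# Dynamics 365 FnO target environment and the AAD app used to authenticate
$d365Url           = 'https://mytestenvironment.sandbox.operations.dynamics.com'
$tenantId          = '00000000-0000-0000-0000-000000000000'
$clientId          = '00000000-0000-0000-0000-000000000000'
$clientSecret      = 'xxxxxxxx'           # better: a secret pipeline variable
$importProjectName = 'ImportParameters'   # the import project we created
$legalEntityId     = 'USMF'
```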

Next, we get the SAS URL for the source (the storage account we created earlier) using the Azure PowerShell module. To make sure we don’t leak the SAS token, we’re only giving it 2 minutes of validity:
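With the Az.Storage cmdlets this can be sketched as follows, assuming $storageAccountName, $storageAccountKey, $storageContainer and $packageName hold the details of the account and the uploaded package:

```powershell
# Context for our storage account
$ctx = New-AzStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey

# Read-only SAS URL for the package, valid for 2 minutes only
$sourceUrl = New-AzStorageBlobSASToken -Context $ctx -Container $storageContainer -Blob $packageName `
    -Permission r -ExpiryTime (Get-Date).AddMinutes(2) -FullUri
```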

We have the source URL; now we will get the destination URL for our MSDyn365FO blob storage account using the OData action GetAzureWriteUrl. We request an authentication token to make the call and generate a file name from the current date and time to make sure it’s unique:
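A sketch of those two calls, assuming $d365Url, $tenantId, $clientId and $clientSecret hold the environment and AAD app details (note that GetAzureWriteUrl returns its payload as a JSON string inside the value property):

```powershell
# Get an AAD token for the FnO environment
$tokenBody = @{
    grant_type    = 'client_credentials'
    client_id     = $clientId
    client_secret = $clientSecret
    resource      = $d365Url
}
$token   = (Invoke-RestMethod -Method Post -Uri "https://login.microsoftonline.com/$tenantId/oauth2/token" -Body $tokenBody).access_token
$headers = @{ Authorization = "Bearer $token" }

# Unique file name from the current date and time
$fileName = "DataPackage_$(Get-Date -Format 'yyyyMMddHHmmss').zip"

# Ask the environment for a writable blob URL
$response = Invoke-RestMethod -Method Post `
    -Uri "$d365Url/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetAzureWriteUrl" `
    -Headers $headers -ContentType 'application/json' `
    -Body (@{ uniqueFileName = $fileName } | ConvertTo-Json)
$destinationUrl = ($response.value | ConvertFrom-Json).BlobUrl
```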

We have both SAS URLs, so we’ll use azcopy to get the data package we exported to restore and copy it to the target environment where we’ll restore it:
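A sketch of the copy, assuming $sourceUrl and $destinationUrl hold the two SAS URLs and azcopy was extracted to c:\temp:

```powershell
# Find azcopy.exe (the zip extracts into a versioned subfolder)
$azcopy = (Get-ChildItem -Path 'c:\temp' -Filter 'azcopy.exe' -Recurse | Select-Object -First 1).FullName

# Server-side copy from our blob to the FnO environment's blob
& $azcopy copy $sourceUrl $destinationUrl
```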

And, finally, we have to trigger the import using the ImportFromPackage action with the parameters for our FnO environment in the body:
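That call can be sketched like this, assuming the authentication headers and variables from the previous steps; the parameter names are the ones documented for the Data management package REST API:

```powershell
# Trigger the import of the package we just copied
$importBody = @{
    packageUrl        = $destinationUrl
    definitionGroupId = $importProjectName  # the import project's name
    executionId       = ''                  # empty: let FnO generate one
    execute           = $true
    overwrite         = $true
    legalEntityId     = $legalEntityId
} | ConvertTo-Json

Invoke-RestMethod -Method Post `
    -Uri "$d365Url/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.ImportFromPackage" `
    -Headers $headers -ContentType 'application/json' -Body $importBody
```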

When the last REST call is done, we can go to the Data management workspace in Dynamics 365 and see the job there:

Data management import job

And done! Now we can refresh an environment with data from production and get it ready to use after changing all the parameters we need or even enabling some users.

Final remarks

As usual, there’s been a lot of trial and error while testing this, and I’m sure that the script can be enhanced and do some things in a different and better manner.

Also, I’m not very skilled at PowerShell, and there are surely best practices I could have applied, like using try-catch blocks or checking the output and result of each operation, but I just wanted to show that this process is possible and can be done.

