Restore a data package with Azure DevOps Pipelines

Moving data from production to a sandbox environment is something we regularly do to get real, up-to-date data for testing or debugging. It's a process that takes time, but it can be automated, as I explained in the post LCS DB API: automating Prod to Dev DB copies.

Me pressing the auto button to automagically import a data package

In this post, I'll add an extra step to the database refresh: restoring a data package (DP). Why? Because I'm sure we all need to change some parameters or endpoints in our test environments after a prod refresh.

You can learn more about the DMF REST API, which I'll be using, by reading this post from Fabio Filardi: Dynamics 365 FinOps: Batch import automation with Azure Functions, Business Events and PowerBI.

How will we do it? #

The idea is the following:

  • We’re using the LCS DB API to refresh a sandbox environment with data from production.
  • We’ve previously exported the data package we will restore.
  • We need an Azure blob storage account to host the data package we’ve exported.
  • Furthermore, we will use the Data management package REST API to get the Dynamics 365 blob URL and OData actions to import the data package.

I'll be skipping the DB restore part because I already explained how to do it in that post, which is the base for this one.

The flow will be:

  1. Get a SAS URL for our saved data package from our Azure blob.
  2. Get the SAS URL for the blob of our Dynamics 365 target environment using the GetAzureWriteUrl OData action.
  3. Copy our data package from the source blob to the target blob.
  4. Execute the import with the ImportFromPackage action.

Create an Export project #

This is the first thing we need to do, in the source environment we want to get the data from. We have to create an export project that contains all the entities we will restore in the target environment. Head to the Data management workspace and press the Export tile:

Data management workspace: Export

Make sure you check the Generate data package checkbox in the export project, and then add all the entities that will be uploaded and imported later:

DMF Export project with Generate data package enabled

Run the job and, when it's done, download the data package.

Create an Azure Blob and upload the Data package #

Create an Azure storage account, then a blob container in it, and upload the data package we've exported there.
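If you prefer to script this step, here's a minimal sketch using the Az.Storage cmdlets (the account, key, container and file names are placeholders):

# Sketch: create the container if it doesn't exist and upload the exported data package
$ctx = New-AzStorageContext -StorageAccountName "YOUR_STORAGE_ACCOUNT_NAME" -StorageAccountKey "ACCESS_KEY"
New-AzStorageContainer -Name "CONTAINER_NAME" -Context $ctx -Permission Off -ErrorAction SilentlyContinue
Set-AzStorageBlobContent -File "C:\temp\YOUR_EXPORTED_DP.zip" -Container "CONTAINER_NAME" -Blob "YOUR_EXPORTED_DP.zip" -Context $ctx -Force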

Create an Import project #

The next step will be creating an import project in our target environment to import the DP. We'll also reference its name in our pipeline, because it's the value we pass as the definitionGroupId parameter of the import call. Select the DP we've exported in the first step and add it:

Import project: add the data package

Azure DevOps pipeline #

And the last step is to create a new pipeline, or add additional steps to the pipeline that runs your DB refresh.

Update: let's forget about uploading azcopy.exe to our repo; we'll be using d365fo.tools instead.

What I originally did was upload azcopy.exe to my code repository:

Added Azcopy.exe to TFVC

Why? Because I tried using the Blob service REST API and couldn't manage to copy from the source blob to the target blob, so I chose the easy way (for me).

Then, my pipeline consists of two tasks:

Pipeline

In the Get sources section I’ve just mapped the Tools folder, like this:

Tools folder mapped as the $(build.sourcesDirectory) root

Regarding the tasks, both are PowerShell tasks. In the first one I just install the Azure PowerShell module and d365fo.tools, and then install AzCopy in the C:\temp folder. I could do everything in one task; this is only a matter of organization. The first task has this script:

Install-Module -Name AZ -AllowClobber -Scope CurrentUser -Force -Confirm:$False -SkipPublisherCheck

Install-Module -Name d365fo.tools -AllowClobber -Scope CurrentUser -Force -Confirm:$false

Invoke-D365InstallAzCopy -Path "C:\temp\AzCopy.exe"

So each time the Microsoft-hosted agent runs the pipeline, it will install the Az PowerShell module, d365fo.tools and AzCopy.
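A small optional tweak: the script below only uses the storage cmdlets (New-AzStorageContext, Get-AzStorageBlob, New-AzStorageBlobSASToken), so installing just the Az.Storage submodule instead of the full Az module should work too and save some installation time on every run:

Install-Module -Name Az.Storage -AllowClobber -Scope CurrentUser -Force -Confirm:$false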

The second step is where everything else happens. This is the full script, which I'll comment on step by step below:

# Getting settings
$file = "YOUR_EXPORTED_DP.zip"
$saname = "YOUR_STORAGE_ACCOUNT_NAME"
$containername = "CONTAINER_NAME"
$key = "ACCESS_KEY"
$environmentUrl = "Dynamics365FNO_ENVIRONMENT_URL"

# Get source blob URL
$ctx = New-AzStorageContext -StorageAccountName $saname -StorageAccountKey $key
$blob = Get-AzStorageBlob -Blob $file -Container $containername -Context $ctx -ErrorAction Stop
$StartTime = Get-Date
$EndTime = $StartTime.AddMinutes(2.0)

$sourceUrl = New-AzStorageBlobSASToken -Container $containername -Blob $file -Permission r -StartTime $StartTime -ExpiryTime $EndTime -FullUri -Context $ctx

# GET BLOB DEST URL
$headers = New-Object "System.Collections.Generic.Dictionary[[String],[String]]"
$headers.Add("Content-Type", "application/x-www-form-urlencoded")
$headers.Add("Accept", "application/json")

$body = New-Object "System.Collections.Generic.Dictionary[[String],[String]]"
$body.Add("tenant_id", "yourtenant.com")
$body.Add("client_id", "AZURE_APP_ID")
$body.Add("client_secret", "SECRET")
$body.Add("grant_type", "client_credentials")
$body.Add("resource", $environmentUrl)

$response = Invoke-RestMethod 'https://login.microsoftonline.com/yourtenant.com/oauth2/token' -Method 'POST' -Headers $headers -Body $body
$response | ConvertTo-Json

#UPLOAD/copy from source to dest
$headersDest = New-Object "System.Collections.Generic.Dictionary[[String],[String]]"
$headersDest.Add("Content-Type", "application/json")
$tokenAuth = "Bearer " + $response.access_token
$headersDest.Add("Authorization", $tokenAuth)

$currDate = Get-Date -Format "yyyyMMdd_HHmmss"
$uploadId = "restoreDP" + $currDate + ".zip"
$bodyDest = '{"uniqueFileName": "' + $uploadId + '"}'
$getAzureUrl = $environmentUrl + "/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetAzureWriteUrl"

$responseDest = Invoke-RestMethod $getAzureUrl -Method 'POST' -Headers $headersDest -Body $bodyDest
$responseDest | ConvertTo-Json
$objUrl = $responseDest.value | ConvertFrom-Json
$destinationUrl = $objUrl.BlobUrl


#COPY TO TARGET
$fileNameCopy = "c:\temp\" + $uploadId

c:\temp\azcopy.exe copy $sourceUrl $fileNameCopy --recursive

c:\temp\azcopy.exe copy $fileNameCopy $destinationUrl --recursive

# EXECUTE IMPORT
$headersImport = New-Object "System.Collections.Generic.Dictionary[[String],[String]]"
$headersImport.Add("Content-Type", "application/json")
$headersImport.Add("Authorization", $tokenAuth)

$bodyImport = "{
`n    `"packageUrl`": `"" + $destinationUrl + "`",
`n    `"definitionGroupId`": `"ImportDP`",
`n    `"executionId`": `"`",
`n    `"execute`": true,
`n    `"overwrite`": true,
`n    `"legalEntityId`": `"USMF`"
`n}"

$executeImportUrl = $environmentUrl + "/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.ImportFromPackage"

$responseImport = Invoke-RestMethod $executeImportUrl -Method 'POST' -Headers $headersImport -Body $bodyImport

Step by step #

The first block corresponds to the settings of your blob storage account and Dynamics 365 Finance and Operations environment:

# Getting settings
$file = "YOUR_EXPORTED_DP.zip"
$saname = "YOUR_STORAGE_ACCOUNT_NAME"
$containername = "CONTAINER_NAME"
$key = "ACCESS_KEY"
$environmentUrl = "Dynamics365FNO_ENVIRONMENT_URL"
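
A quick note before moving on: hardcoding the storage account key and the client secret in the task isn't ideal. A minimal alternative is to define them as secret pipeline variables and map them to environment variables in the PowerShell task (the variable names below are made up), or to pull them from Azure Key Vault as explained in the Secure your Azure Pipelines with Azure Key Vault section of this guide:

# Sketch with hypothetical secret variable names mapped to environment variables in the task
$key = $env:STORAGE_ACCOUNT_KEY
$clientSecret = $env:AAD_CLIENT_SECRET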

Next, we get the SAS URL for the source blob, the one in the storage account we created earlier, using the Azure PowerShell module. To make sure we don't mess up with the SAS token, we only give it 2 minutes of validity, which is enough because the copy happens just a few calls later:

# Get source blob URL
$ctx = New-AzStorageContext -StorageAccountName $saname -StorageAccountKey $key
$blob = Get-AzStorageBlob -Blob $file -Container $containername -Context $ctx -ErrorAction Stop
$StartTime = Get-Date
$EndTime = $StartTime.AddMinutes(2.0)

$sourceUrl = New-AzStorageBlobSASToken -Container $containername -Blob $file -Permission r -StartTime $StartTime -ExpiryTime $EndTime -FullUri -Context $ctx

We have the source URL; now we'll get the destination URL for our MSDyn365FO blob storage account using the GetAzureWriteUrl OData action. To make the call we first request an authentication token, and we generate a filename from the current date and time to make sure it's unique:

# GET BLOB DEST URL
$headers = New-Object "System.Collections.Generic.Dictionary[[String],[String]]"
$headers.Add("Content-Type", "application/x-www-form-urlencoded")
$headers.Add("Accept", "application/json")

$body = New-Object "System.Collections.Generic.Dictionary[[String],[String]]"
$body.Add("tenant_id", "yourtenant.com")
$body.Add("client_id", "AZURE_APP_ID")
$body.Add("client_secret", "SECRET")
$body.Add("grant_type", "client_credentials")
$body.Add("resource", $environmentUrl)

$response = Invoke-RestMethod 'https://login.microsoftonline.com/yourtenant.com/oauth2/token' -Method 'POST' -Headers $headers -Body $body
$response | ConvertTo-Json

#UPLOAD/copy from source to dest
$headersDest = New-Object "System.Collections.Generic.Dictionary[[String],[String]]"
$headersDest.Add("Content-Type", "application/json")
$tokenAuth = "Bearer " + $response.access_token
$headersDest.Add("Authorization", $tokenAuth)

$currDate = Get-Date -Format "yyyyMMdd_HHmmss"
$uploadId = "restoreDP" + $currDate + ".zip"
$bodyDest = '{"uniqueFileName": "' + $uploadId + '"}'
$getAzureUrl = $environmentUrl + "/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetAzureWriteUrl"

$responseDest = Invoke-RestMethod $getAzureUrl -Method 'POST' -Headers $headersDest -Body $bodyDest
$responseDest | ConvertTo-Json
$objUrl = $responseDest.value | ConvertFrom-Json
$destinationUrl = $objUrl.BlobUrl
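
This isn't in the original script, but a small guard here can save a confusing AzCopy error later on; assuming an empty BlobUrl means the call failed:

if ([string]::IsNullOrEmpty($destinationUrl)) {
    throw "GetAzureWriteUrl didn't return a blob URL. Check the AAD app credentials and the environment URL."
}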

We have both SAS URLs, so we'll use AzCopy to download the data package we exported to a temporary file and then copy it to the target environment's blob, where we'll restore it:

#COPY TO TARGET
$fileNameCopy = "c:\temp\" + $uploadId

c:\temp\azcopy.exe copy $sourceUrl $fileNameCopy --recursive

c:\temp\azcopy.exe copy $fileNameCopy $destinationUrl --recursive
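
One caveat the original script doesn't cover: AzCopy is a native executable, so a failed copy won't necessarily make the PowerShell task fail on its own. Adding a check like this after each copy makes the pipeline stop instead of calling ImportFromPackage with a package that never arrived:

if ($LASTEXITCODE -ne 0) {
    throw "azcopy copy failed with exit code $LASTEXITCODE"
}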

And, finally, we have to trigger the import using the ImportFromPackage action with the parameters for our FnO environment in the body:

# EXECUTE IMPORT
$headersImport = New-Object "System.Collections.Generic.Dictionary[[String],[String]]"
$headersImport.Add("Content-Type", "application/json")
$headersImport.Add("Authorization", $tokenAuth)

$bodyImport = "{
`n `"packageUrl`": `"" + $destinationUrl + "`",
`n `"definitionGroupId`": `"ImportDP`",
`n `"executionId`": `"`",
`n `"execute`": true,
`n `"overwrite`": true,
`n `"legalEntityId`": `"USMF`"
`n}"

$executeImportUrl = $environmentUrl + "/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.ImportFromPackage"

$responseImport = Invoke-RestMethod $executeImportUrl -Method 'POST' -Headers $headersImport -Body $bodyImport
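
Keep in mind the call is asynchronous: ImportFromPackage just queues the import job and returns its execution ID. If you want the pipeline to wait for the outcome, a sketch like this should do, assuming the execution ID comes back in the response's value property and using the GetExecutionSummaryStatus action of the same Data management package REST API:

# Sketch: poll the import status until it reaches a final state
$executionId = $responseImport.value
$statusUrl = $environmentUrl + "/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetExecutionSummaryStatus"

do {
    Start-Sleep -Seconds 60
    $statusResponse = Invoke-RestMethod $statusUrl -Method 'POST' -Headers $headersImport -Body ('{"executionId": "' + $executionId + '"}')
    $status = $statusResponse.value
    Write-Output "Import status: $status"
} while ($status -in 'NotRun', 'Executing')

if ($status -ne 'Succeeded') {
    throw "Data package import finished with status $status"
}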

When the last REST call is done, we can go to the Data management workspace in Dynamics 365 and see that the job is there:

Data management import job

And done! Now we can refresh an environment with data from production and get it ready to use after changing all the parameters we need, or even enabling some users.
