ariste.info

Automating Prod to Dev DB copies

The new LCS DB API endpoint to create a database export has been published! With it we now have a way of automating and scheduling a database refresh from your Dynamics 365 FnO production environment to a developer or Tier 1 VM.
Using the LCS DB API

The bacpac issue

One of the main setbacks with prod DB refreshes is that the process isn’t quick, because you need to:
  • Refresh a Tier 2+ environment with prod’s DB
  • Export a bacpac from the Tier 2+ environment
  • Restore the bacpac on a Tier 1 VM.
This happens because Tier 2+ environments use Azure SQL as the DB engine while Tier 1 VMs use SQL Server. The time it takes to complete the process depends on the size of the database and the performance of the VM you’ll restore it to, but it’s not a fast process at all. For a 60 GB database you’ll get a bacpac of around 7 GB that will take:
  • 1 to 2 hours to refresh to UAT
  • 2 to 4 hours for the bacpac to be exported
  • At least 4 hours to restore it to a Tier 1 VM.
That’s between 7 and 11 hours until you have the DB on a developer machine. Once it’s there you can quickly get a BAK and share it, but you might need a full working day before that data is available.

Save us, LCS DB API!

Thanks to the new LCS DB API endpoint we can perform all these steps automatically, and with the help of d365fo.tools it’ll be even easier. But first, because the whole process takes so long, we have to decide on a schedule (daily, weekly, etc.), and that schedule must be compatible with the release cadence to UAT/Prod, because only one operation can run at a time. There’s still another problem, but I’ll talk about it after going through the scripts.

My proposal

To do the last part of the LCS DB API flow from prod to dev, we need a Tier 1 VM where the bacpac will be restored. My idea is to use the build VM on Microsoft’s subscription and an Azure DevOps pipeline to run all the scripts that will restore the DB on that VM. It’s an underused machine and it fits this purpose perfectly.

I want to clarify why I’ve thought about doing this with the build VM. In most cases this VM will be doing nothing during the night, maybe only running some tests, and it’s during that period that I suggest doing all of this. But be aware that, depending on your DB size, this won’t be possible, or you’ll run out of space after 2 or 3 restores.

So think about deploying an extra VM and installing an agent there to do this. Whatever you do, don’t mess with the build VM if you don’t know what you’re doing! Try this on a dev VM or anywhere else if you’re afraid of breaking something. Remember you’ll lose the capacity to generate deployable packages and run pipelines if this environment breaks! This post is just an example of a possible solution; you need to decide what suits you best.

As I said before, I’ll be using Mötz Jensen‘s d365fo.tools. We could do everything without them, but that would be a bit silly because using the tools is easier, faster and makes everything clearer. I’ve separated all the steps into 3 PowerShell scripts: execute the refresh, export the bacpac and restore the bacpac.

Refresh database

This will refresh the prod environment’s database to a Tier 2+ environment:
$clientId = "ab12345-6220-4566-896a-19a4ad41783f"
$userName = "admin@tenant"
$passWord = "admin123456"
$projectId = "1234567"
$sourceEnvId = "958bc863-f089-4811-abbd-c1190917eaae"
$targetEnvId = "13aa6872-c13b-4ea3-81cd-2d26fa72ec5e"

Get-D365LcsApiToken -ClientId $clientId -Username $userName -Password $passWord -LcsApiUri "https://lcsapi.lcs.dynamics.com" -Verbose | Set-D365LcsApiConfig -ProjectId $projectId

Invoke-D365LcsDatabaseRefresh -SourceEnvironmentId $sourceEnvId -TargetEnvironmentId $targetEnvId -SkipInitialStatusFetch

Export database

This part will trigger the bacpac export from the Tier 2+ environment which we’ve just refreshed:
$sourceEnvId = "958bc863-f089-4811-abbd-c1190917eaae"
$targetEnvId = "13aa6872-c13b-4ea3-81cd-2d26fa72ec5e"
# Name the bacpac will be saved under in LCS' asset library
$bacpacName = "UAT{0}" -f (Get-Date -Format yyyyMMdd)

Get-D365LcsApiConfig | Invoke-D365LcsApiRefreshToken | Set-D365LcsApiConfig

Invoke-D365LcsDatabaseExport -SourceEnvironmentId $targetEnvId -BackupName $bacpacName

Restore bacpac

And the final step will download the bacpac and restore it to a new database:
$currentDate = Get-Date -Format yyyyMMdd
$bacpacName = "UAT{0}" -f $currentDate
$downloadPath = "D:\UAT{0}.bacpac" -f $currentDate
$newDBName = "AxDB_{0}" -f $currentDate

Get-D365LcsApiConfig | Invoke-D365LcsApiRefreshToken | Set-D365LcsApiConfig

$backups = Get-D365LcsDatabaseBackups

$fileLocation = $backups[0].FileLocation

Invoke-D365AzCopyTransfer -SourceUri $fileLocation -DestinationUri $downloadPath

Import-D365Bacpac -ImportModeTier1 -BacpacFile $downloadPath -NewDatabaseName $newDBName
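If you schedule this, the dated AxDB_* copies will pile up and eat the build VM’s disk, which is the space problem I mentioned above. Here’s a minimal cleanup sketch, assuming restored DBs follow the AxDB_yyyyMMdd naming used in the restore script and that the SqlServer PowerShell module is available on the agent; the retention period and server instance are my own placeholders:

```powershell
# Sketch: drop dated copies (AxDB_yyyyMMdd) older than the retention period to free disk space.
# Assumes the SqlServer module (Invoke-Sqlcmd) is installed and a local SQL Server instance.
$retentionDays = 7
$cutoff = (Get-Date).AddDays(-$retentionDays)

# [_] escapes the underscore wildcard in T-SQL LIKE
$databases = Invoke-Sqlcmd -ServerInstance "localhost" -Query "SELECT name FROM sys.databases WHERE name LIKE 'AxDB[_]%'"

foreach ($db in $databases) {
    # Parse the yyyyMMdd suffix; skip names that don't match the expected pattern
    $suffix = $db.name.Substring(5)
    $dbDate = [datetime]::MinValue
    $parsed = [datetime]::TryParseExact($suffix, 'yyyyMMdd', [cultureinfo]::InvariantCulture, 'None', [ref]$dbDate)
    if ($parsed -and $dbDate -lt $cutoff) {
        Invoke-Sqlcmd -ServerInstance "localhost" -Query "DROP DATABASE [$($db.name)]"
    }
}
```

A step like this could run as the last task of the restore pipeline, so the VM doesn’t fill up after a few nightly restores.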

Using it in an Azure DevOps pipeline

Azure DevOps pipeline
That’s it. Create a PowerShell script for each step, place them on the build VM and call them from your pipeline. This only works with the agent hosted on the build VM. Everything could probably run on an Azure-hosted agent too, but I won’t cover that here because I think using the build VM, where we can restore the DB, is more useful to us.
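As a reference, the script-on-the-build-VM setup can be sketched as a scheduled YAML pipeline. The pool name, script path and cron schedule below are placeholders of mine, not values from LCS or Azure DevOps:

```yaml
# Sketch: run one of the DB scripts every night on the self-hosted build VM agent.
schedules:
  - cron: "0 1 * * *"        # every night at 01:00 UTC
    displayName: Nightly DB refresh
    branches:
      include:
        - main
    always: true             # run even when there are no code changes

trigger: none                # scheduled runs only, no CI trigger

pool:
  name: SelfHostedBuildVM    # placeholder: the pool containing the build VM agent

steps:
  - task: PowerShell@2
    displayName: Refresh prod DB to Tier 2+
    inputs:
      targetType: filePath
      filePath: 'C:\Scripts\RefreshDatabase.ps1'   # placeholder path on the build VM
```

You’d create one pipeline like this per step (refresh, export, restore), each with its own schedule.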

Timing

These 3 scripts will call the LCS DB API to refresh, export and restore the DB. But there’s the timing issue: refreshing the database takes some time, and so does exporting it. You need a way to track the status of the operations. The LCS DB API offers an operation you can use to get the status of an ongoing operation. Using d365fo.tools:
Get-D365LcsDatabaseOperationStatus -OperationActivityId 123456789 -EnvironmentId "99ac6587-c13b-4ea3-81cd-2d26fa72ec5e"
You can choose to control that inside your PowerShell scripts, but if we use the agent on the build VM, that means we can’t use it for anything else until everything is done. That’s why I separated the process into 3 steps.

You can manually schedule 3 pipelines, one for each step, at the times you know each stage ends. Then you can choose the order: export, restore, refresh; or refresh, export, restore. You could also use the Windows Task Scheduler and forget about Azure DevOps pipelines, but we’re not doing that because we love pipelines.

And that’s all. We finally have a way of moving data without having to do it manually, and we can schedule it, but we need to take some decisions about how we’ll do things. I’ll leave that up to you 🙂
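If you’d rather run everything in a single pipeline and accept blocking the agent, the status check can be wrapped in a simple polling loop. This is only a sketch under my assumptions: the activity and environment IDs are placeholders, and I’m assuming the cmdlet returns an object with an OperationStatus property and an "InProgress"/"Completed" convention; check the d365fo.tools documentation for the exact output shape:

```powershell
# Sketch: block until the LCS operation finishes, polling every 5 minutes.
$activityId = 123456789                                  # placeholder operation activity id
$envId = "99ac6587-c13b-4ea3-81cd-2d26fa72ec5e"          # placeholder environment id

do {
    Start-Sleep -Seconds 300
    $status = Get-D365LcsDatabaseOperationStatus -OperationActivityId $activityId -EnvironmentId $envId
    Write-Output "Current status: $($status.OperationStatus)"
} while ($status.OperationStatus -eq "InProgress")

# Fail the pipeline step if the operation didn't complete successfully
if ($status.OperationStatus -ne "Completed") {
    throw "LCS operation $activityId finished with status $($status.OperationStatus)"
}
```

With a loop like this between each step, one pipeline could run the whole refresh–export–restore chain, at the cost of keeping the agent busy for hours.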


13 Comments

  1. Ankit 3 years ago Reply

    Hi Adria,

    Thank you for such an informative and useful article.
    I currently have multiple Dev/Test environments, and I was wondering how the script decides which environment it should restore the bacpac database to. I did not see any parameter for a target environment in the Import-D365Bacpac function. Can you please advise if that is something I can configure in the DevOps pipeline?

    Many Thanks,
    Ankit

    • Adrià Ariste Santacreu Post Author 3 years ago Reply

      Hi Ankit,

      the idea is that the pipeline runs on a self-hosted agent, a Dynamics 365 build VM where the bacpac will be restored to, that’s why the Import-D365Bacpac cmdlet has no target. If you want to restore a DB to different environments the easiest way, at least for me, is to restore the bacpac in a single VM, then get a BAK file from there and restore it to other environments, which is faster than restoring a bacpac. And if you’d like to automate the BAK part too, you’d probably have to configure Azure DevOps agents in each environment to be able to run the pipeline there.

  2. Pavan 2 years ago Reply

    Hi Adrià Ariste Santacreu,
    In the databaseExport script, why do we need to mention the source and target environment IDs? I would like to create a release just for the export operation without source and target environment IDs. Is this achievable?

    Regards,
    Pavan

    • Adrià Ariste Santacreu Post Author 2 years ago Reply

      Hi Pavan,

      it’s not needed if you just want to trigger the export operation, but the values are used in the scripts after that; that’s why I included both at the beginning of the script.

      • Pavan 1 year ago Reply

        Hi Adrià Ariste Santacreu,

        I’m trying to refresh prod data to sandbox and I’m getting the error below.
        Invoke-D365LcsDatabaseRefresh : Cannot bind argument to parameter 'LcsApiUri' because it is an empty string.

        Below is the PS script I’m using.

        Install-PackageProvider nuget -Scope CurrentUser -Force -Confirm:$false
        Install-Module -Name AZ -AllowClobber -Scope CurrentUser -Force -Confirm:$False -SkipPublisherCheck
        Install-Module -Name d365fo.tools -AllowClobber -Scope CurrentUser -Force -Confirm:$false
        $clientId = ""
        $userName = ""
        $passWord = ""
        $projectId = ""
        $sourceEnvId = ""
        $targetEnvId = ""

        Get-D365LcsApiToken -ClientId $clientId -Username $userName -Password $passWord -LcsApiUri "https://lcsapi.lcs.dynamics.com" -Verbose | Set-D365LcsApiConfig -ProjectId $projectId

        Invoke-D365LcsDatabaseRefresh -SourceEnvironmentId $sourceEnvId -TargetEnvironmentId $targetEnvId -SkipInitialStatusFetch

        Could you kindly help with this?

        • Adrià Ariste Santacreu Post Author 1 year ago Reply

          Hi Pavan,

          check if the endpoint is the expected one for your region: https://learn.microsoft.com/en-us/dynamics365/fin-ops-core/dev-itpro/deployment/deployment-options-geo#supported-geographies-and-endpoints

          • Pavan 1 year ago

            Hi Adrià Ariste Santacreu,

            The above suggestion worked. Thanks for your support.

            Regards,
            Pavan

  3. Marek 2 years ago Reply

    Hi Adria,

    firstly, thanks a lot for your work on these manuals. Massively appreciated.

    I have one question. Where do you set/get $bacpacName?

    I was interested only in the export, without refreshing the environment DB, so I set one for myself. But I don’t see it defined in your scripts. Am I missing something?

    • Adrià Ariste Santacreu Post Author 2 years ago Reply

      Hi Marek,

      thank you for reading, glad it can help.

      About the $bacpacName… it looks like I’ve missed it somewhere. You can define it anywhere in your script and that should be fine; it’s the name the bacpac will be saved under in LCS’ asset library.

  4. Hari Krishnan 5 months ago Reply

    What if my DB bacpac is more than 50 GB and exceeds the limit?

    • Adrià Ariste Santacreu Post Author 5 months ago Reply

      Which is that limit?

  5. Thomas 4 months ago Reply

    Adria, thank you so much for these articles. I’ve found that under your timing section it appears Get-D365LcsDatabaseRefreshStatus has been replaced with Get-D365LcsDatabaseOperationStatus in the latest updates. It might be worth making note of this as I was getting errors until researching this further.

    • Adrià Ariste Santacreu Post Author 4 months ago Reply

      Hi Thomas,

      Thank you for letting me know, I’ve updated the article.


© 2024 ariste.info. Designed by CazaPelusas.
