The new LCS DB API endpoint to create a database export has been published! With it we now have a way of automating and scheduling a database refresh from your Dynamics 365 FnO production environment to a developer or Tier 1 VM.
You can learn more about the LCS DB REST API by reading these posts I wrote some time ago. You might want to read them because I'm skipping some steps that are already explained there.
You can also read the full guide on MSDyn365FO & Azure DevOps ALM.
And remember: this is currently in private preview. If you want to join the preview you first need to be part of the Dynamics 365 Insider Program, where you can join the "Dynamics 365 for Finance and Operations Insider Community". Once invited to the Yammer organization you can ask to join the "Self-Service Database Movement / DataALM" group, where you'll get the information to add yourself to the preview and enable it on LCS.
The bacpac issue
One of the main setbacks we currently have with prod DB refreshes is that it’s not a quick thing to do because you need to:
- Refresh a Tier 2+ environment with prod’s DB
- Export a bacpac from the Tier 2+ environment
- Restore the bacpac on a Tier 1 VM
This happens because Tier 2+ environments use Azure SQL as the DB engine and Tier 1 VMs use SQL Server.
The time it takes to complete the process depends on the size of the database and the performance of the VM you'll restore it to. But it's not a fast process at all. For a 60GB database you'll get a bacpac of around 7GB, and the steps will take roughly:
- 1 to 2 hours to refresh to UAT
- 2 to 4 hours for the bacpac to be exported
- At least 4 hours to restore it to a Tier 1 VM
That’s between 7 and 11 hours until you have the DB on a developer machine. Once it’s there you can quickly get a BAK and share it (see the sketch below), but you might need a full working day before that data is available.
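For instance, here's a minimal sketch of taking that BAK with the SqlServer PowerShell module; the instance name, database name and backup path are assumptions you'd adapt to your VM:

# Requires the SqlServer module: Install-Module SqlServer
# Back up the restored database to a plain BAK that can be shared with other developers
Backup-SqlDatabase -ServerInstance "localhost" -Database "AxDB" -BackupFile "D:\Backups\AxDB.bak"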
Save us LCS DB API!
Thanks to the new LCS DB API’s endpoint we can perform all these steps automatically, and with the help of d365fo.tools it’ll be even easier. But first…
Because the whole process takes so long, we first have to decide on a schedule (daily, weekly, etc.), and that schedule must be compatible with the release cadence to UAT/Prod, because only one operation can run at a time.
There’s still another problem, but I’ll talk about it after we’ve seen the scripts.
My proposal
To do the last part of the LCS DB API flow from prod to dev, we need a Tier 1 VM where the bacpac will be restored. My idea is to use the build VM on Microsoft’s subscription and an Azure DevOps pipeline to run all the scripts that will restore the DB on that VM. It’s an underused machine and it fits this purpose perfectly.
UPDATE: after this comment on Twitter by Volker Deuss I want to clarify why I’ve thought about doing this using the build VM. In most cases this VM will be doing nothing during the night, maybe only running some tests, and it’s during that period of time that I suggest doing all this. But be aware that depending on your DB size this won’t be possible, or you’ll run out of space after 2 or 3 restores.
So think about deploying an extra VM and installing an agent there to do this. Whatever you do, don’t mess with the build VM if you don’t know what you’re doing! Try this on a dev VM or anywhere else if you’re afraid of breaking something. Remember you’ll lose the capacity to generate deployable packages (DPs) and run pipelines if this environment breaks!
This post is just an example of a possible solution; you need to decide what suits you best! End of the update.
As I said before, I’ll be using Mötz Jensen’s d365fo.tools. We could do everything without them, but that would be a bit stupid because the tools make everything easier, faster and clearer.
I’ve separated all the steps into 3 PowerShell scripts: execute the refresh, export the bacpac, and restore the bacpac.
Refresh database
This will refresh a Tier 2+ environment with the prod database:
$clientId = "ab12345-6220-4566-896a-19a4ad41783f"       # AAD app registration used against the LCS API
$userName = "admin@tenant"
$passWord = "admin123456"
$projectId = "1234567"                                  # LCS project ID
$sourceEnvId = "958bc863-f089-4811-abbd-c1190917eaae"   # prod environment ID
$targetEnvId = "13aa6872-c13b-4ea3-81cd-2d26fa72ec5e"   # Tier 2+ (e.g. UAT) environment ID
# Get an LCS API token and store it, together with the project ID, for all later calls
Get-D365LcsApiToken -ClientId $clientId -Username $userName -Password $passWord -LcsApiUri "https://lcsapi.lcs.dynamics.com" -Verbose | Set-D365LcsApiConfig -ProjectId $projectId
# Trigger the refresh from prod to the Tier 2+ environment
Invoke-D365LcsDatabaseRefresh -SourceEnvironmentId $sourceEnvId -TargetEnvironmentId $targetEnvId -SkipInitialStatusFetch
Export database
This part will trigger the bacpac export from the Tier 2+ environment which we’ve just refreshed:
$targetEnvId = "13aa6872-c13b-4ea3-81cd-2d26fa72ec5e"   # Tier 2+ (e.g. UAT) environment ID
$bacpacName = "UAT{0}" -f (Get-Date -Format yyyyMMdd)   # name the export will get in the LCS asset library
# Reuse the stored config and refresh the token in case it has expired
Get-D365LcsApiConfig | Invoke-D365LcsApiRefreshToken | Set-D365LcsApiConfig
# Trigger the bacpac export from the Tier 2+ environment we've just refreshed
Invoke-D365LcsDatabaseExport -SourceEnvironmentId $targetEnvId -BackupName $bacpacName
Restore bacpac
And the final step will download the bacpac and restore it to a new database:
$currentDate = Get-Date -Format yyyyMMdd                # MM is month; the original mm would have returned minutes
$downloadPath = "D:\UAT{0}.bacpac" -f $currentDate
$newDBName = "AxDB_{0}" -f $currentDate
# Reuse the stored config and refresh the token in case it has expired
Get-D365LcsApiConfig | Invoke-D365LcsApiRefreshToken | Set-D365LcsApiConfig
# Get the database backups from the LCS asset library (assuming the newest one is listed first)
$backups = Get-D365LcsDatabaseBackups
$fileLocation = $backups[0].FileLocation
# Download the bacpac with AzCopy and import it into a new database on the local SQL Server
Invoke-D365AzCopyTransfer -SourceUri $fileLocation -DestinationUri $downloadPath
Import-D365Bacpac -ImportModeTier1 -BacpacFile $downloadPath -NewDatabaseName $newDBName
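Note that the bacpac is imported into a new database (AxDB_yyyyMMdd), so the environment keeps running against the old AxDB until you swap them. Here's a minimal sketch of that swap with d365fo.tools, assuming Switch-D365ActiveDatabase's -SourceDatabaseName parameter behaves as documented for your version of the tools:

# Stop the D365FO services before touching the databases
Stop-D365Environment -All
# Make the freshly imported database the active AxDB (the old AxDB is kept under another name)
Switch-D365ActiveDatabase -SourceDatabaseName $newDBName
Start-D365Environment -All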
Using it in an Azure DevOps pipeline
This is it. Create the PowerShell scripts, place them on the build VM and call them from your pipelines (a minimal YAML sketch follows). This is only valid for the agent hosted on the build VM. Everything could probably also run on an Azure-hosted agent, but I won’t cover that here because I think using the build VM, where we can restore the DB, is more useful to us.
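As a sketch, a scheduled YAML pipeline for the restore step could look like this. The cron expression, agent pool name and script path are all assumptions you'll need to adapt to your project:

trigger: none          # only run on the schedule below

schedules:
- cron: "0 3 * * *"    # 03:00 UTC every day (assumed schedule)
  displayName: Nightly bacpac restore
  branches:
    include:
    - main
  always: true         # run even if there are no new commits

pool: BuildVM          # the agent pool where your build VM's agent lives (assumed name)

steps:
- task: PowerShell@2
  inputs:
    targetType: filePath
    filePath: 'C:\DbScripts\RestoreBacpac.ps1'   # assumed location of the restore script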
Timing
These 3 scripts will call the LCS DB API to refresh, export and restore the DB. But there’s still the timing issue I mentioned earlier.
Refreshing the database takes some time, and so does exporting it. You need a way to check the status of the operations. The LCS DB API offers an operation you can use to get the status of an ongoing operation. Using d365fo.tools:
Get-D365LcsDatabaseRefreshStatus -OperationActivityId 123456789 -EnvironmentId "99ac6587-c13b-4ea3-81cd-2d26fa72ec5e"
You could poll that status inside your PowerShell scripts, as in the sketch below, but if we use the agent on the build VM it means we can’t use it for anything else until everything is done.
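If blocking the agent is acceptable, a minimal polling sketch inside the refresh script could look like this. I'm assuming the object returned by Invoke-D365LcsDatabaseRefresh exposes an OperationActivityId property and that the status object has an OperationStatus property; verify both against your version of the tools:

# Trigger the refresh and keep the returned operation details
$refresh = Invoke-D365LcsDatabaseRefresh -SourceEnvironmentId $sourceEnvId -TargetEnvironmentId $targetEnvId -SkipInitialStatusFetch
# Poll every 5 minutes until the operation is no longer running
do {
    Start-Sleep -Seconds 300
    $status = Get-D365LcsDatabaseRefreshStatus -OperationActivityId $refresh.OperationActivityId -EnvironmentId $targetEnvId
} while ($status.OperationStatus -eq "InProgress")   # possible status values are an assumption, check the API docs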
That’s why I separated the process into 3 steps. You can schedule 3 pipelines, one for each step, at the times you know each stage ends. Then you can choose the order: export, restore, refresh, or refresh, export, restore.
You could also use Windows Task Scheduler and forget about AZDO Pipelines, but we’re not doing that because we love pipelines.
And that’s all: we finally have a way of moving data without doing it manually, and we can schedule it, but we need to make some decisions about how we’ll do things. I’ll leave that up to you 🙂
Comments
Hi Sir,
I need your help: we need to restore a bacpac file to a Tier 1 environment. The bacpac is on my local machine, but I didn’t find anything describing the Tier 1 environment in our code. Can you please help and guide us on how to achieve this? We tried to follow your restore bacpac script, but we got an error and unfortunately couldn’t execute it.
Thanks in advance.
Hi Syed,
If you only want to import a bacpac into a dev environment, check this link: https://learn.microsoft.com/en-us/dynamics365/fin-ops-core/dev-itpro/database/dbmovement-scenario-exportuat#import-the-database