There’s a new version and a new task for our release pipelines that use the Azure-hosted agents. These changes have been introduced recently to support the new MSAL authentication libraries for the LCS service connection used to upload and deploy the deployable packages.
The current service connections use Azure Active Directory (Azure AD) Authentication Library (ADAL), and support for ADAL will end in June 2022.
This means that if we don’t update the Asset Upload and Asset Deployment tasks to their new versions (1.* and 2.*, respectively), the release pipelines could stop working after June 30, 2022.
I’m sure each team manages this differently: maybe it’s left to each developer to update their VM, or someone on the customer or partner side does it. And that’s in the best case; maybe nobody is updating the developer machines at all…
If you want to know more about builds, releases, and the Dev ALM of Dynamics 365 you can read my full guide on MSDyn365 & Azure DevOps ALM.
Today, I’m bringing you a PowerShell script that you can run in a pipeline that will automatically update all your developer virtual machines!
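The script in the post does the real work; as a rough idea of the kind of step such a pipeline runs, here’s a minimal sketch that assumes a self-hosted agent is installed on each developer VM and uses the PSWindowsUpdate module to apply pending Windows updates (that module choice is my own illustration, not necessarily everything the full script does):

```powershell
# Minimal sketch: assumes a self-hosted agent runs on each developer VM and
# that installing the PSWindowsUpdate module is acceptable in your setup.

# Make sure the module that drives Windows Update is available
if (-not (Get-Module -ListAvailable -Name PSWindowsUpdate)) {
    Install-Module -Name PSWindowsUpdate -Force -Scope CurrentUser
}
Import-Module PSWindowsUpdate

# Download and install all pending updates, rebooting if needed
Get-WindowsUpdate -AcceptAll -Install -AutoReboot
```

The nice part is that the same pipeline can cover every developer VM just by adding one agent (or one job) per machine.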
In this post, I’ll add an extra step to the database refresh: restore a data package (DP). Why? Because I’m sure we all need to change some parametrization or some endpoints in our test environments after a prod refresh.
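I’ll go through the details below, but as a minimal sketch of the idea: once the refreshed database is up, one way to restore the package is calling the standard Data management package REST API (the ImportFromPackage action). Everything below (tenant, app registration, environment URL, package location, import project name, legal entity) is a placeholder you’d replace with your own values:

```powershell
# Minimal sketch: uses the Data management package REST API to import a data
# package. All values below are placeholders for illustration only.

$tenant       = "contoso.onmicrosoft.com"
$clientId     = "<aad-app-id>"
$clientSecret = "<aad-app-secret>"
$environment  = "https://mytestenv.cloudax.dynamics.com"

# Get an AAD token for the environment
$tokenBody = @{
    grant_type    = "client_credentials"
    client_id     = $clientId
    client_secret = $clientSecret
    resource      = $environment
}
$token = (Invoke-RestMethod -Method Post `
    -Uri "https://login.microsoftonline.com/$tenant/oauth2/token" `
    -Body $tokenBody).access_token

# Ask the environment to import a data package from a blob URL
$importBody = @{
    packageUrl        = "<sas-url-to-the-data-package-zip>"
    definitionGroupId = "Parameters after refresh"   # name of the import project
    executionId       = ""                           # let FnO generate one
    execute           = $true
    overwrite         = $true
    legalEntityId     = "USMF"
} | ConvertTo-Json

Invoke-RestMethod -Method Post `
    -Uri "$environment/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.ImportFromPackage" `
    -Headers @{ Authorization = "Bearer $token" } `
    -ContentType "application/json" `
    -Body $importBody
```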
You can learn more about the LCS DB REST API by reading these posts I wrote some time ago. You might want to read them because I’m skipping some steps which are already explained there:
If you receive the LCS email notifications for your projects you already know this: all Tier 1 virtual machines from Microsoft’s subscription will be gone as early as 1 December!
What do you mean gone?
This is what the emails say:
As communicated previously, Microsoft is removing the use of Remote Desktop Protocol (RDP) to access environments managed by Microsoft. As RDP access is required for development, going forward customers will be required to develop using a Cloud Hosted Environment or download a local “Virtual Hard Disk” (VHD) within Lifecycle Services. Cloud Hosted Environments will allow customers to manage the compute, size, and cost of these environments. This infrastructure change will ensure that customers decouple development tools from any running environment.
In addition, effective November 1, Tier 1 environments will not be included in the purchase of Dynamics 365 Finance, Dynamics 365 Supply Chain Management, Dynamics 365 Project Operations, or Dynamics 365 Commerce apps. The ability to purchase additional Add-On tier 1 environments will also be removed at this time. Beginning December 1, Remote Desktop Protocol (RDP) access for the existing Tier 1 Developer environments, managed by Microsoft, will be removed and decommissioned. Customers will need to preserve or move data within these environments by this date. See the FAQ below with links to existing documentation.
Microsoft will continue to invest in development tools and processes to allow customers to extend the rich capabilities available within Dynamics 365. Learn about one of these key investments, which allows for build automation that uses Microsoft-hosted agents and Azure Pipelines. This approach helps customers avoid the setup, maintenance, and cost of deploying build virtual machines (VMs). It also reuses the existing setup of build agents to run other .Net build automation.
Azure credits will be provided for qualifying customers to use for deploying Tier 1’s using Cloud Hosted Environments. Complete this survey to submit your request.
Honestly, it’s been a bit of a surprise. We had already been informed of the RDP removal, as the email says, and the removal of build VMs had been a rumor for at least two years. But this is pretty drastic and on such short notice! December is less than two months away!
But wait… instead of speculating, Evaldas Landauskas asked Microsoft, and it looks like the virtual machines won’t be deleted immediately on December 1st but progressively:
FYI: I have asked MS about this topic and this is the answer I have got.
"The process will start with removal of RDP and eventually we will decommission these environments after no activity."
So it won't be decommissioned immediately if it's in use.
Tonight we got a new email from LCS with detailed and updated dates. So the dates have finally been pushed back a bit, and this is the schedule:
November 1, 2020: no more Tier 1 add-on purchases or deployments. Empty slots will be removed.
December 1, 2020: RDP access will be removed.
January 30, 2021: notices will be sent regarding deallocation and deletion of Tier 1 VMs.
What to do now?
That depends on how you’re using that VM and whether you have add-on Tier 1 environments. Another thing to consider is the cost of replacing that VM.
I only use it as a build server
If you only have one Tier 1 VM and use it as the build server you have two options:
You will need a VM if you’re running tests or a DB sync as part of your build process. This is the only way. Regarding costs: you could deploy a B8MS VM with two 128 GB Premium SSD disks for around 280€ ($330) per month. You could even try a B4MS for about 160€ ($190) per month.
That’s more or less the same price as a Microsoft managed Tier 1 VM. And if you just run tests and DB sync once a day you can even reduce the cost if you start and stop the VM from your pipeline.
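As an example of that last idea, starting and stopping the build VM from a pipeline can be as simple as something like this (a minimal sketch assuming an Azure PowerShell step that is already authenticated against the subscription; the resource group and VM name are placeholders):

```powershell
# Minimal sketch: run from a pipeline step that is already signed in to Azure.
# Resource group and VM name are placeholders.

$resourceGroup = "rg-d365-build"
$vmName        = "build-vm"

# Start the VM before the nightly build...
Start-AzVM -ResourceGroupName $resourceGroup -Name $vmName

# ...and once the build (tests + DB sync) is done, deallocate it so you stop paying for compute
Stop-AzVM -ResourceGroupName $resourceGroup -Name $vmName -Force
```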
If you don’t need any of that (tests or DB sync), or you just want a CI build that compiles the code, you can simply set up the Azure-hosted builds. And if you need extra agents, they’re cheaper than any build VM.
I use it as a dev VM
If you’re using add-on Microsoft managed VMs for development you need to deploy a new VM in your (or your customer’s) subscription.
Concerned about the extra cost? Don’t be: if you deploy a DS12 v2 VM with three 128 GB Premium SSD disks and use it for 8 hours a day, 20 days a month, you’ll pay around 120€ ($140) per month.
In either case, if you read the email you’ll see that Microsoft will give out Azure credits in exchange for these VMs, but how many credits is not known yet. I hope this eases the transition, but I’m sure there’ll be plenty of complaining 😂
I bet that most of us have had to develop some .NET class library to solve something in Dynamics 365 Finance and Operations. You create a C# project, build it, and add the DLL as a reference in your FnO project. Don’t do that anymore! You can add the .NET project to source control, build it in your pipeline, and the DLL gets added to the deployable package!
I’ve been trying this over the last few days after a conversation on Yammer, and while I’ve managed to build .NET and X++ code in the same pipeline, I’ve found some issues and limitations.
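I’ll get to those in the post, but to give a rough idea of the shape of it: the pipeline ends up building the solution that contains both the C# class library and the X++ project, so the DLL is produced before the packaging steps run. This is only a sketch with made-up paths, and your pipeline may locate MSBuild differently (for example through the built-in Visual Studio Build task):

```powershell
# Rough sketch only: paths are placeholders and your build may resolve MSBuild
# through the built-in tasks instead of calling it directly.
$msbuild  = "C:\Program Files (x86)\Microsoft Visual Studio\2019\BuildTools\MSBuild\Current\Bin\MSBuild.exe"
$solution = "$env:BUILD_SOURCESDIRECTORY\Projects\MySolution.sln"

# Build the solution that holds both the C# class library and the X++ project,
# so the compiled DLL is available when the deployable package is created.
& $msbuild $solution /p:Configuration=Release /m
```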
After waiting for it for a long time, it’s finally here! If any of your customers has self-service sandbox environments, you’ve been doing this by hand. We’ve been on self-service with a customer for over a year and a half, since the private preview, and we’ve REALLY missed this feature in Azure DevOps.
I’m sorry for my English-speaking readers because this post may be a bit useless for you, as all the content I’ll talk about is in Spanish. But it’s always good to know!
In the last few days I’ve taken part in a community event, the 365 Saturday online, and I’ve also started a podcast. I want to talk a bit about this.
Dynamics Power Spain Online 2020
This has been my fourth time taking part as a speaker in the last three years, and as usual I presented a session with Juanan. This time we talked about using Azure DevOps with Microsoft Dynamics 365 for Finance and Operations.
It’s a topic I write about a lot, but we really think there are still many people using it the wrong way, or just using the source control part. And that’s bad!
I’ve written this post after Mötz Jensen asked me to, in a long and really interesting Twitter discussion on branching and version control in Dynamics 365 for Finance and Operations. This is Denis Trunin’s tweet that started it all:
New blog post – Understanding D365FO Version control system and why it is different from AX2012 https://t.co/S26JU99qtP
Behold #XppGroupies! The day we’ve been waiting for has come! The Azure-hosted builds are in public preview with PU35!! We can now stop asking Joris when this will be available, because it already is! Check the docs!
I’ve been able to write this because I’ve been testing it for a few months with access to the private preview. And of course thanks to Joris for inviting us to the preview!
What does this mean? We no longer need a VM to run the build pipelines! Nah, we still need it! If you’re running tests or synchronizing the DB as a part of your build pipeline you still need the VM. But we can move CI builds to the Azure-hosted agent!
Remember, this is a public preview. If you want to join it, you first need to be part of the Dynamics 365 Insider Program, where you can join the "Dynamics 365 for Finance and Operations Insider Community". Once invited, you should see a new LCS project called PEAP Assets, and inside its Asset Library you’ll find the nugets in the NuGet package section.
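To give an idea of how those NuGet packages end up in the build, the restore step can look roughly like this (folder names and file layout are placeholders; the official docs include the exact package list and a sample pipeline):

```powershell
# Rough sketch: assumes the .nupkg files downloaded from the PEAP Assets Asset
# Library are somewhere the agent can reach, the repo has a packages.config
# listing them, and nuget.exe is available on the agent. Folders are placeholders.

$nugetsFolder   = "$env:BUILD_SOURCESDIRECTORY\NuGets"      # where the .nupkg files live
$packagesConfig = "$env:BUILD_SOURCESDIRECTORY\Projects\packages.config"
$restoreFolder  = "$env:PIPELINE_WORKSPACE\NuGetPackages"

# Register the folder as a NuGet source and restore the X++ compiler/build packages
nuget.exe sources add -Name "D365BuildPackages" -Source $nugetsFolder
nuget.exe restore $packagesConfig -PackagesDirectory $restoreFolder
```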
The new LCS DB API endpoint to create a database export has been published! With it we now have a way of automating and scheduling a database refresh from your Dynamics 365 FnO production environment to a developer or Tier 1 VM.
Using the LCS DB API
You can learn more about the LCS DB REST API by reading these posts I wrote some time ago. You might want to read them because I’m skipping some steps which are already explained there:
And remember: this is currently in private preview. If you want to join the preview, you first need to be part of the Dynamics 365 Insider Program, where you can join the "Dynamics 365 for Finance and Operations Insider Community". Once invited to the Yammer organization, you can ask to join the "Self-Service Database Movement / DataALM" group, where you’ll get the information to add yourself to the preview and enable it on LCS.
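And just to show the shape of the call, here’s a minimal sketch of triggering an export once you have an LCS token (the route follows the database movement API docs at the time of writing; the project ID, environment ID, and backup name are placeholders, and since this is a preview the details may change):

```powershell
# Minimal sketch: calls the database movement export endpoint. $lcsToken is an
# AAD token obtained for the https://lcsapi.lcs.dynamics.com resource, and the
# IDs below are placeholders.

$projectId     = "1234567"
$environmentId = "<source-environment-id>"
$backupName    = "WeeklyProdExport"

$uri = "https://lcsapi.lcs.dynamics.com/databasemovement/v1/export" +
       "/project/$projectId/environment/$environmentId/backupName/$backupName"

Invoke-RestMethod -Method Post -Uri $uri `
    -Headers @{ Authorization = "Bearer $lcsToken" }
```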