If you receive the LCS email notifications for your projects, you already know this: all Tier 1 virtual machines in Microsoft’s subscription will be gone as early as December 1!

Tier 1 VMs will be gone
What do you mean gone?

This is what the emails say:

As communicated previously, Microsoft is removing the use of Remote Desktop Protocol (RDP) to access environments managed by Microsoft. As RDP access is required for development, going forward customers will be required to develop using a Cloud Hosted Environment or download a local “Virtual Hard Disk” (VHD) within Lifecycle Services. Cloud Hosted Environments will allow customers to manage the compute, size, and cost of these environments. This infrastructure change will ensure that customers decouple development tools from any running environment.

In addition, effective November 1, Tier 1 environments will not be included in the purchase of Dynamics 365 Finance, Dynamics 365 Supply Chain Management, Dynamics 365 Project Operations, or Dynamics 365 Commerce apps. The ability to purchase additional Add-On tier 1 environments will also be removed at this time. Beginning December 1, Remote Desktop Protocol (RDP) access for the existing Tier 1 Developer environments, managed by Microsoft, will be removed and decommissioned. Customers will need to preserve or move data within these environments by this date. See the FAQ below with links to existing documentation.

Microsoft will continue to invest in development tools and processes to allow customers to extend the rich capabilities available within Dynamics 365. Learn about one of these key investments, which allows for build automation that uses Microsoft-hosted agents and Azure Pipelines. This approach helps customers avoid the setup, maintenance, and cost of deploying build virtual machines (VMs). It also reuses the existing setup of build agents to run other .Net build automation.

Azure credits will be provided for qualifying customers to use for deploying Tier 1’s using Cloud Hosted Environments. Complete this survey to submit your request.

Honestly, it’s been a bit of a surprise. We had already been informed of the RDP removal, as the email says, and the removal of the build VMs had been rumored for at least two years. But this is pretty drastic, and on such short notice! December is less than two months away!

But wait… instead of speculating, Evaldas Landauskas asked Microsoft, and it looks like the virtual machines won’t be deleted all at once on December 1 but progressively:

Update!

Tonight we got a new email from LCS with detailed and updated dates. The dates have finally been pushed back a bit, and this is the schedule:

  • November 1, 2020: no more Tier 1 add-on purchases or deployments. Empty slots will be removed.
  • December 1, 2020: RDP access will be removed.
  • January 30, 2021: notices will be sent regarding deallocation and deletion of Tier 1 VMs.

What to do now?

That depends on how you’re using that VM and whether you have add-on Tier 1 environments. The other thing to weigh is the cost of replacing that VM.

I only use it as a build server

If you only have one Tier 1 VM and use it as the build server, you have two options: deploy a build VM in your own subscription, or move to the Azure-hosted builds.

You’ll still need a VM if you run tests or a DB sync as part of your build process; there’s no way around that. Regarding costs: you could deploy a B8ms VM with two 128 GB Premium SSD disks for around 280€ ($330) per month, or even try a B4ms for about 160€ ($190) per month.

That’s more or less the same price as a Microsoft-managed Tier 1 VM. And if you only run tests and a DB sync once a day, you can reduce the cost further by starting and stopping the VM from your pipeline, as in the sketch below.
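Here’s a minimal PowerShell sketch of that idea, assuming an Azure DevOps PowerShell task with the Az module available and a service connection allowed to manage the VM; the resource group and VM names are placeholders:

```powershell
# Minimal sketch: wake the build VM up before the build and deallocate it after.
# The resource group and VM names below are hypothetical.
$resourceGroup = 'd365-build-rg'
$vmName        = 'build-vm01'

# Start the VM before the compile / test / DB sync steps run
Start-AzVM -ResourceGroupName $resourceGroup -Name $vmName

# ... the build, test and DB sync steps would run here ...

# Deallocate the VM afterwards so compute stops billing (-Force skips the prompt)
Stop-AzVM -ResourceGroupName $resourceGroup -Name $vmName -Force
```

With a B-series VM, deallocating it outside the daily build window is where most of the savings come from: while it’s stopped you only pay for the disks.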

If you don’t need that, or just want a CI build that compiles the code, you can set up the Azure-hosted builds. And if you need extra agents, they’re cheaper than any build VM.

I use it as a dev VM

If you’re using add-on Microsoft-managed VMs for development, you need to deploy a new VM in your (or your customer’s) subscription.

Concerned about the extra cost? Don’t be: if you deploy a DS12 v2 VM with three 128 GB Premium SSD disks and use it for 8 hours a day, 20 days per month (160 compute hours instead of the roughly 730 in a full month), you’ll pay around 120€ ($140) per month.

In both cases, as the email says, Microsoft will give out Azure credits in exchange for these VMs, but how many credits is not known yet. I hope this eases the transition, but I’m sure there’ll be plenty of complaining 😂

You can read about other use scenarios of the Tier 1 VM in Nathan Clouse’s blog.

This is another post about solving Dynamics 365 problems using external tools, although I’m starting to think of anything Azure-related as not external at all. In this case I’ll show different scenarios using Azure Functions with Dynamics 365.

I wrote this almost three weeks ago, intending it to be a two-part post, but after seeing this wonderful blog post about Azure Functions from Fabio Filardi I don’t know what else I could add, so I’ll probably leave it here. Go check it out!

In this post we’ll see what Azure Functions are and how to create one locally and deploy it to Azure.
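As a taste of the shape these take, here’s a minimal sketch of an HTTP-triggered Azure Function written in PowerShell; the parameter handling and greeting are just illustrative, not the function built in the post:

```powershell
# run.ps1 of an HTTP-triggered Azure Function (PowerShell worker).
# Minimal illustrative sketch; the body and values are placeholders.
using namespace System.Net

param($Request, $TriggerMetadata)

# Read an optional query-string parameter from the incoming request
$name = $Request.Query.Name
if (-not $name) { $name = 'MSDyn365FO' }

# Return the HTTP response through the function's output binding
Push-OutputBinding -Name Response -Value ([HttpResponseContext]@{
    StatusCode = [HttpStatusCode]::OK
    Body       = "Hello, $name!"
})
```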

You can read my complete guide on Microsoft Dynamics 365 for Finance & Operations and Azure DevOps.

After the update to my last post about calling the LCS API from Azure DevOps Pipelines, I thought that having a pipeline with a password in plain sight was not very secure. How can we add extra security to a pipeline? Once again we can turn to an Azure tool for help: Azure Key Vault.

Azure Key Vault

A Key Vault is a service that lets us safely store certificates or secrets and later use them in our applications and services. Like many other Azure services it has a cost, but it’s really low: under normal use you’ll be billed about a cent a month, or nothing at all. Don’t be stingy with security!
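For instance, here’s a minimal PowerShell sketch of the idea applied to the LCS pipeline, assuming a recent Az.KeyVault module and an existing vault; the vault and secret names are placeholders:

```powershell
# Minimal sketch: keep the LCS password in Key Vault instead of in the pipeline.
# The vault and secret names below are hypothetical.
$vaultName = 'd365-lcs-kv'

# Store the password once, interactively, so it never appears in pipeline code
Set-AzKeyVaultSecret -VaultName $vaultName -Name 'LcsPassword' `
    -SecretValue (Read-Host -AsSecureString -Prompt 'LCS password')

# In the pipeline, read it back at runtime instead of hard-coding it
$lcsPassword = Get-AzKeyVaultSecret -VaultName $vaultName -Name 'LcsPassword' -AsPlainText
```

Azure DevOps also has a built-in Azure Key Vault task that maps secrets to pipeline variables, which achieves the same thing without a custom script.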

You can read my complete guide on Microsoft Dynamics 365 for Finance & Operations and Azure DevOps.

I talked about the LCS Database Movement API in a post not long ago, and in this one I’ll show how to call the API using PowerShell from your Azure DevOps Pipelines.

What for?

Basically, automation. Right now the API only allows a refresh from one Microsoft Dynamics 365 for Finance and Operations environment to another, so the idea is to have fresh data from production in our UAT environments daily. I don’t know which new operations the API will support in the future, but another idea could be adding the DB export operation (creating a BACPAC) to the pipeline, to keep a copy of production ready to be restored to a development environment.
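To make it concrete, here’s a minimal PowerShell sketch of the refresh call, assuming you already hold an OAuth bearer token for LCS in $token and that the endpoint follows the shape of the Database Movement API documentation; the project and environment IDs are placeholders:

```powershell
# Minimal sketch of queueing a database refresh through the LCS API.
# The project and environment IDs below are hypothetical placeholders.
$projectId = '1234567'
$sourceEnv = '00000000-0000-0000-0000-000000000001'   # production environment ID
$targetEnv = '00000000-0000-0000-0000-000000000002'   # UAT environment ID

$uri = "https://lcsapi.lcs.dynamics.com/databasemovement/v1/refresh" +
       "/project/$projectId/source/$sourceEnv/target/$targetEnv"

# Queue the refresh; the response carries an activity ID you can poll for status
$response = Invoke-RestMethod -Method Post -Uri $uri `
    -Headers @{ Authorization = "Bearer $token" }
$response
```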

You can read my complete guide on Microsoft Dynamics 365 for Finance & Operations and Azure DevOps.

Last night (at least it was night for me :P) the new Azure DevOps tasks to deploy the packages and update model versions were published:

There’s also an announcement on the Community blogs with extended details. Let’s look at the new tasks and how to set them up.

First of all… DISCLAIMER: think twice before using this on a production environment. Then think again. And if you finally decide to use it, do it in the most cautious and lightweight way possible.

Why does this deserve a disclaimer? Well, even though the docs state that system performance should not be impacted, I don’t really know its true impact. Plus, it’s an ERP, and one where we don’t have access to the production environment (unless you’re on-prem) to verify that there’s no performance degradation. And Microsoft is probably already using it to collect the data from the environments that shows up in LCS, and I don’t know whether this could interfere with that. A lot of I-don’t-knows.

Would I use it on production? YES. It will be really helpful in some cases.

With that said, what am I going to write about that needs a disclaimer? As the title says: using Azure Application Insights in Microsoft Dynamics 365 for Finance and Operations. This post is just another of the “Have you seen that? Yeah, we should try it!” consequences of Juanan (follow him on Twitter!) and me talking. And the “that” this time was this post from Lane Swenka on AX Developer Connection. So nothing original here 🙂

Azure Application Insights

I spy!! Made by Cazapelusas

What’s Application Insights? As the documentation says:

Application Insights is an extensible Application Performance Management (APM) service for web developers on multiple platforms. Use it to monitor your blah web application. It will blah blah detect blaaah anomalies. It blah powerful blahblah tools to bleh blah blih and blah blah blaaaah. It’s blaaaaaaaah.

Mmmm… you better watch this video:

So much misery and sadness in the first 30 seconds…

Monitoring. That’s what it does and what it’s for. “LCS already does that!” OK, extra monitoring! Everybody loves extra, like on pizzas, unless it’s pineapple, of course.

Getting it to work

The first step is to create an Application Insights resource in our Azure subscription. Regarding pricing: the first 5 GB per month are free, and data is retained for 90 days. More details here.

Then we need the code. I’ll skip the details here because they’re perfectly explained in the link above (this one); just follow the steps. You basically need to create a DLL that handles the events and sends the data to Application Insights, and then use it from MSDyn365FO. In our version we’ve additionally added the trackTrace method to the C# library. Then just add a reference to the DLL in your MSDyn365FO Visual Studio project and it’s ready to use.

What can we measure?

And now the interesting part (I hope): page views, errors (or all the infologs), batch executions, field value changes, and anything else you can extend in order to call our API methods.

For example, we can extend the FormDataUtil class from the forms engine. This class has several methods that are called from forms on different data source actions, like validating writes, deletes, field validations, etc… And also this:

modifiedField in FormDataUtil

This method runs after a form field’s value is modified. We’ll extend it to log which field had its value changed, plus the old and new values. Like this:

Extending modifiedField
I promise I always use labels!

And because the Application Insights call also stores the user that triggered the change, we’ve just got ourselves a new database log! Even better, one with no performance cost, because no extra data is generated or stored on MSDyn365FO’s side. The only drawback is that it’s only called from forms, but it might be enough to monitor form usage and counter the classic “no, I haven’t changed any parameter” 🙂

This is what we get on Azure’s Application Insights metrics explorer:

Azure Application Insights Custom Event
What do you mean I changed that?!

Yes you did, Admin! Oops, it’s me…

Custom events

We’re also storing the AOS name and whether the call originated in a batch.
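If you’re curious what one of these custom events looks like on the wire, here’s a hedged PowerShell sketch that posts an equivalent event straight to Application Insights’ public v2/track ingestion endpoint. The instrumentation key and all the property names (fieldName, aosName, isBatch, etc.) are made-up placeholders, and the real calls in our DLL go through the .NET SDK instead:

```powershell
# Minimal sketch only: post a custom event like the ones the DLL sends,
# using the public ingestion endpoint instead of the .NET SDK.
# The instrumentation key and every property below are hypothetical.
$iKey = '00000000-0000-0000-0000-000000000000'

$envelope = @{
    name = 'Microsoft.ApplicationInsights.Event'
    time = (Get-Date).ToUniversalTime().ToString('o')
    iKey = $iKey
    data = @{
        baseType = 'EventData'
        baseData = @{
            ver        = 2
            name       = 'FieldModified'
            properties = @{
                fieldName = 'CustGroup'
                oldValue  = '10'
                newValue  = '20'
                aosName   = 'AOS01'
                isBatch   = 'false'
            }
        }
    }
} | ConvertTo-Json -Depth 5

Invoke-RestMethod -Method Post -Uri 'https://dc.services.visualstudio.com/v2/track' `
    -ContentType 'application/json' -Body $envelope
```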

All the metrics from our events show up in Azure, and the data can later be visualized in Power BI, if you feel like doing it.

With this example you can go on and add calls to the extended objects wherever you need them: batches, integrations, critical processes, etc…

Again, please plan what you want to monitor before using this, and test it. Then test it again, especially on SAT environments with Azure SQL databases, which perform a bit differently from regular SQL Server ones.

And enjoy the data!
