If you’re integrating Dynamics 365 Finance & Operations with third parties, and either your organization or the third party uses a firewall, you might have found yourself being asked “what is the production/sandbox IP address?”.
So, what should I do if I have a firewall and need to allow access to/from Dynamics 365 F&O or any other Azure service? The network team usually doesn’t like the answer: if you can’t allowlist an FQDN, you have to open all the address ranges for the datacenter and service you want to reach. And that’s a lot of addresses, which makes the network team sad.
In today’s post, I’ll show you a way to keep an eye on the ranges provided by Microsoft, and hopefully make our lives easier.
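One way to do that: Microsoft publishes the Azure IP ranges and Service Tags as a regularly refreshed JSON download. Here's a minimal Python sketch of pulling the prefixes for a given tag out of that file; the embedded JSON is a trimmed, illustrative sample that follows the structure of the real download, not the real data:

```python
import json

# Trimmed sample following the structure of the "Azure IP Ranges and
# Service Tags" JSON download (the real file contains hundreds of tags;
# the prefixes below are illustrative only).
SAMPLE = """
{
  "cloud": "Public",
  "values": [
    {
      "name": "AppService.WestEurope",
      "properties": {
        "region": "westeurope",
        "systemService": "AzureAppService",
        "addressPrefixes": ["13.69.68.64/28", "13.69.112.168/29"]
      }
    },
    {
      "name": "AzureActiveDirectory",
      "properties": {
        "region": "",
        "systemService": "AzureAD",
        "addressPrefixes": ["20.190.128.0/18"]
      }
    }
  ]
}
"""

def prefixes_for_tag(doc: dict, tag_name: str) -> list:
    """Return the address prefixes for a given service tag name."""
    for tag in doc["values"]:
        if tag["name"] == tag_name:
            return tag["properties"]["addressPrefixes"]
    return []

doc = json.loads(SAMPLE)
print(prefixes_for_tag(doc, "AppService.WestEurope"))
```

In practice you'd download the current file, diff the prefixes against the previous run, and hand the network team only the changes.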
We’ve been working with the F&O development VMs for a long time, especially Microsoft partners, who need to be able to switch quickly and easily between different customer environments; using the VHD is a bit more complicated in that scenario.
And of course, we use the Remote Desktop Protocol to connect to these VMs. RDP is insecure due to its weak encryption, widespread use, and the lack of security features built into the protocol, so attackers often target RDP to gain unauthorized access to systems. You can learn more about securing your VMs in Best practices for defending Azure Virtual Machines.
Today, we will walk through the steps of configuring Azure Bastion for Dynamics 365 Finance and Operations development VMs.
In the previous post, we learned about my proposed architecture for Azure API Management for integrations. And I closed the post with a pending item: the details of how to deploy and configure all the Azure resources the solution uses.
And in today’s post we’re going to see how all the resources are created, linked together and configured to have a fully working API logging solution, and how we can deploy everything using Bicep.
I’m back with additional information about Azure API Management! More Azure content, and I’ll probably continue to produce posts regarding Azure in the future.
I believe there are numerous ways to learn new things, and for me, two of them are writing blog posts and using new technologies to solve problems at work. Of course, my goal is to attempt to apply the Azure themes I write about to Dynamics 365.
Today, I’m presenting an architecture approach for integrations, leveraging API Management and various other Azure components, for Dynamics 365 or anything else that has an endpoint.
Let’s talk about logs in Dynamics 365 Finance and Operations. And I don’t mean the built-in database logs we’ve had since the old Axapta days. I’m talking about plain logs: a table and a form to see how/why data is changing, or a record of external calls to OData or custom web service endpoints in the ERP.
It’s something that I’m sure almost all developers have had to do at some point. The customer wants to record CRUD events, you suggest enabling the DB log, but the customer wants a new form to see the data. Or maybe you’re monitoring all the calls to your custom web services.
It’s time to stop doing this, or at least to change how we do it.
It’s been a bit over 6 years since Microsoft released Dynamics AX7 in 2016. During this time we’ve gone through some name changes. And all of those who were already working with AX 2009 or AX 2012 have had to go through some habit changes too.
We’re slowly shifting to a true SaaS model for Finance and Operations, and one of the features of SaaS is that you pay for a specific amount of services. And this includes database space. This is something our CRM/Power Platform/Dataverse colleagues know much better than us.
If you check the Dynamics 365 licensing guide (I hope this will be the only time I write about licensing) you can see that the Finance and Operations database has a limited assigned amount of space:
Yes, that’s right: only 20 GB of base DB capacity, plus extra capacity per user license. Everything that exceeds that capacity will be billed. I haven’t found the price in the latest version of the licensing guide, but the last time I saw it, it was around €40/GB. That said, I haven’t heard of anyone being billed yet.
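To get a feel for the numbers, here's a back-of-envelope sketch of the overage math. The 20 GB base and the ~€40/GB price are the figures quoted above; the per-user extra capacity varies by license type, so it's just a parameter here:

```python
def estimated_overage_cost(used_gb: float, base_gb: float = 20.0,
                           per_user_gb: float = 0.0, users: int = 0,
                           price_per_gb_eur: float = 40.0) -> float:
    """Rough estimate of monthly DB overage cost.

    base_gb and price_per_gb_eur follow the figures quoted in the post;
    the capacity each user license adds depends on the license type, so
    it is left as a parameter (the 0.25 GB below is an assumption).
    """
    entitled = base_gb + per_user_gb * users
    overage = max(0.0, used_gb - entitled)
    return overage * price_per_gb_eur

# e.g. 75 GB used, 100 users each adding an assumed 0.25 GB of capacity
print(estimated_overage_cost(75, users=100, per_user_gb=0.25))  # → 1200.0
```

Even a modest database can blow past the entitlement quickly if log tables keep growing unchecked.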
This means we need to be careful about what we create in the ERP. And for me, one of the things we need to do is get rid of logs stored inside Dynamics 365 F&O.
What to use instead? That’s up to you, but there are plenty of options without going outside the Microsoft ecosystem. And of course for a fraction of the price you’d pay for a single GB of extra DB capacity.
Azure Monitor is a bigger, more complete (and more complex) monitoring solution. If you’re already using it, you could integrate your Dynamics 365 logs there, but otherwise I wouldn’t choose it as the first option for creating logs.
And of course we also have old-style file-based logs, which could be the cheapest solution using Azure Blob Storage. File-based logs aren’t the easiest to read, but there’s plenty of software that helps with that.
Whatever solution you choose, even if it’s none of the above, the way of implementing it should be more or less the same: deciding which data you want to log, and in which format, should be your first step.
Then, depending on which logging tool you’re going to use, you’ll need to create some kind of API to send the data to that service.
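Whatever the destination, the shape of that API call is usually the same: build a structured record, then POST it to the collection endpoint. A minimal Python sketch, where the endpoint URL and field names are illustrative assumptions, not a real service:

```python
import json
import urllib.request
from datetime import datetime, timezone

def build_log_entry(source: str, event: str, payload: dict) -> dict:
    """Build a structured log record; settle the schema before the tooling."""
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "source": source,          # e.g. "D365FO-TEST" (illustrative)
        "event": event,            # e.g. "SalesOrder.Create" (illustrative)
        "payload": payload,
    }

def send_log(entry: dict, endpoint: str) -> None:
    """POST the record to the log-collection endpoint (hypothetical URL)."""
    req = urllib.request.Request(
        endpoint,
        data=json.dumps(entry).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        resp.read()

entry = build_log_entry("D365FO-TEST", "SalesOrder.Create", {"orderId": "SO-000123"})
# send_log(entry, "https://logs.example.com/ingest")  # hypothetical endpoint
print(entry["event"])
```

The same record shape works whether the receiver is Application Insights, a Function that writes to Blob Storage, or anything else with an HTTP front door.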
A final thought
I’ve been thinking about this a lot lately. And also about putting an API Management instance in front of each Dynamics 365 environment. A lot. This could be another step towards a real cloud ERP that takes advantage of being hosted in Azure and uses more of the tools Azure offers.
Is it extra work? Yes, it is. Is this extra work worth the effort? In my opinion, it is.
Working with APIM is what made me think about logging. When you configure an APIM instance you can enable Application Insights. Then you can track the request that reaches the APIM, the request that goes into Dynamics 365, the response from Dynamics 365 to the APIM, and the response from the APIM to the caller, with complete detail at every step!
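The trick that makes those four hops readable as one story is a shared correlation id. You can reproduce the same idea in your own logs; a tiny sketch (the hop names are just labels I made up for illustration):

```python
import uuid

def trace(correlation_id: str, hop: str, direction: str) -> dict:
    """One trace record per hop, all sharing the same correlation id."""
    return {"correlationId": correlation_id, "hop": hop, "direction": direction}

# One correlation id ties the four hops of a single call together
cid = str(uuid.uuid4())
hops = [
    trace(cid, "caller->APIM", "request"),
    trace(cid, "APIM->D365FO", "request"),
    trace(cid, "D365FO->APIM", "response"),
    trace(cid, "APIM->caller", "response"),
]
print(all(h["correlationId"] == cid for h in hops))
```

Querying by that single id then gives you the full round trip of any one call, which is exactly what the Application Insights transaction view does for you out of the box.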
So, why not do the same when logging things inside Dynamics 365? Knowing why something is going wrong is important, and that’s why we sometimes need logs; using tools like Application Insights, which have far more capabilities than a D365 table and form, is a much better idea.
In today’s post, I want to talk about using Azure API Management (APIM) alongside Dynamics 365 Finance and Operations.
Azure API Management is a hybrid, multi-cloud management platform for APIs across all environments. This means that, after deploying an APIM instance, you can create an API that exposes services from one system or from several.
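From a consumer's point of view, calling an APIM-fronted API is just an HTTP call plus a subscription key; `Ocp-Apim-Subscription-Key` is APIM's default header name for that key. A small Python sketch, with a made-up gateway URL, path and key:

```python
import urllib.request

def apim_request(base_url: str, path: str,
                 subscription_key: str) -> urllib.request.Request:
    """Build a request to an APIM-fronted API.

    APIM's default subscription key header is Ocp-Apim-Subscription-Key;
    the gateway URL and path below are hypothetical examples.
    """
    return urllib.request.Request(
        f"{base_url.rstrip('/')}/{path.lstrip('/')}",
        headers={"Ocp-Apim-Subscription-Key": subscription_key},
    )

req = apim_request("https://contoso-apim.azure-api.net",
                   "/d365/data/Customers", "my-key")
print(req.full_url)
```

The caller never sees the Dynamics 365 URL or its authentication details; swapping the backend, throttling, or adding a cache happens in APIM policies without touching the consumer.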
In this post, I’ll add an extra step to the database refresh: restore a data package (DP). Why? Because I’m sure we all need to change some parametrization or some endpoints in our test environments after a prod refresh.
If you receive the LCS email notifications for your projects you already know this: all Tier 1 virtual machines from Microsoft’s subscription will be gone as early as 1 December!
This is what the emails say:
As communicated previously, Microsoft is removing the use of Remote Desktop Protocol (RDP) to access environments managed by Microsoft. As RDP access is required for development, going forward customers will be required to develop using a Cloud Hosted Environment or download a local “Virtual Hard Disk” (VHD) within Lifecycle Services. Cloud Hosted Environments will allow customers to manage the compute, size, and cost of these environments. This infrastructure change will ensure that customers decouple development tools from any running environment.
In addition, effective November 1, Tier 1 environments will not be included in the purchase of Dynamics 365 Finance, Dynamics 365 Supply Chain Management, Dynamics 365 Project Operations, or Dynamics 365 Commerce apps. The ability to purchase additional Add-On tier 1 environments will also be removed at this time. Beginning December 1, Remote Desktop Protocol (RDP) access for the existing Tier 1 Developer environments, managed by Microsoft, will be removed and decommissioned. Customers will need to preserve or move data within these environments by this date. See the FAQ below with links to existing documentation.
Microsoft will continue to invest in development tools and processes to allow customers to extend the rich capabilities available within Dynamics 365. Learn about one of these key investments, which allows for build automation that uses Microsoft-hosted agents and Azure Pipelines. This approach helps customers avoid the setup, maintenance, and cost of deploying build virtual machines (VMs). It also reuses the existing setup of build agents to run other .Net build automation.
Azure credits will be provided for qualifying customers to use for deploying Tier 1’s using Cloud Hosted Environments. Complete this survey to submit your request.
Honestly, it came as a bit of a surprise. We had already been told about the RDP removal, as the email says, and the removal of build VMs had been a rumor for at least two years. But this is pretty drastic, and with such short notice: December is less than two months away!
But wait… instead of speculating, Evaldas Landauskas asked Microsoft, and it looks like the virtual machines won’t be deleted immediately on the 1st, but progressively:
Tonight we got a new email from LCS with detailed, updated dates. The dates have been pushed back a bit, and this is the schedule:
November 1, 2020: no more Tier 1 add-on purchases or deployments. Empty slots will be removed.
December 1, 2020: RDP access will be removed.
January 30, 2021: notices will be sent regarding deallocation and deletion of Tier 1 VMs.
What to do now?
That depends on how you’re using that VM and whether you have add-on Tier 1 environments. Another thing to weigh is the cost of replacing that VM.
I only use it as a build server
If you only have one Tier 1 VM and use it as the build server you have two options:
You will need a VM if you run tests or a DB sync as part of your build process; this is the only way. Regarding costs, you could deploy a B8ms VM with two 128 GB Premium SSD disks for around €280 ($330) per month. You could even try a B4ms for about €160/month ($190).
If you don’t need that, or just want a CI build that compiles the code, you can set up the Azure-hosted builds. And if you need extra agents, they’re cheaper than any build VM.
I use it as a dev VM
If you’re using add-on Microsoft-managed VMs for development, you’ll need to deploy a new VM in your (or your customer’s) subscription.
Concerned about the extra cost? Don’t be: if you deploy a DS12 v2 VM with three 128 GB Premium SSD disks and use it 8 hours a day, 20 days a month, you’ll pay around €120 ($140) per month.
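The saving comes from deallocating the VM when nobody is developing, so you only pay compute for the hours it actually runs. A back-of-envelope sketch; the ~€0.75/hour rate is simply what the ~€120/month figure above implies at 8 hours × 20 days, ignoring the disks (which bill around the clock):

```python
def monthly_vm_cost(hourly_rate_eur: float, hours_per_day: float = 8,
                    days_per_month: int = 20) -> float:
    """Compute-only cost for a dev VM deallocated outside working hours."""
    return hourly_rate_eur * hours_per_day * days_per_month

# ~0.75 EUR/hour is the rate implied by the post's ~120 EUR/month figure
print(monthly_vm_cost(0.75))  # → 120.0
```

Doubling the hours doubles the compute bill, which is a good argument for auto-shutdown schedules on dev VMs.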
In both cases, if you read the email you’ll see that Microsoft will give out Azure credits in exchange for these VMs, though how many credits isn’t known yet. I hope this eases the transition, but I’m sure there’ll be plenty of complaining 😂
This is another post about solving Dynamics 365 problems using external tools. However, I’m starting to think of everything Azure-related as not being external at all. In this case I’ll show different scenarios using Azure Functions with Dynamics 365.
A Key Vault is a service that allows us to safely store certificates and secrets and later use them in our applications and services. And like many other Azure services it has a cost, but it’s really low: for normal use you’ll be billed a few cents a month, or nothing at all. Don’t be stingy with security!
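Reading a secret from code is short. A sketch using the `azure-identity` and `azure-keyvault-secrets` packages; the vault and secret names are hypothetical, and you need a signed-in identity (e.g. `az login` or a managed identity) for it to work:

```python
def get_secret(vault_name: str, secret_name: str) -> str:
    """Fetch a secret from Azure Key Vault.

    Requires `pip install azure-identity azure-keyvault-secrets`.
    Imports live inside the function so this sketch can be loaded
    without the Azure SDK installed.
    """
    from azure.identity import DefaultAzureCredential
    from azure.keyvault.secrets import SecretClient

    client = SecretClient(
        vault_url=f"https://{vault_name}.vault.azure.net",
        credential=DefaultAzureCredential(),
    )
    return client.get_secret(secret_name).value

# usage (hypothetical vault/secret names):
# connection_string = get_secret("my-vault", "erp-connection-string")
```

The point is that the secret never lives in source control or in a config file; the application asks the vault for it at runtime, and access is governed by Azure identities.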