Category: Azure

A short one for today! I stumbled upon this while configuring the new Azure Synapse Link for Dataverse feature for an environment.

Most of the time, when I’m configuring resources on Azure, I do it as a subscription or resource group owner. This means I’m working in god-mode all the time, and I won’t run into some of the issues that users with fewer access rights do.

After a long, long, long, long, long time waiting for it, the Dynamics 365 F&O local development feature is finally here… in public preview. It’s a bit different from what we first heard about at MBAS in 2019… but it opens up a new scenario for developers.

I won’t go into much detail about the new unified developer experience, as it’s now called, but if you want to know what you need to use it and how to configure it, you can read this blog post from Aurélien Clere: Dynamics 365 FinOps Unified developer experience.

Today I’ll explain what Microsoft Dev Box is and how we can use it with the Dynamics 365 F&O local development tools. We will also learn about the cost, and in which scenarios we could benefit from Microsoft Dev Box as our development environment. You know, the best of both worlds, or the worst… 🤣

But first…

Today I’m bringing you a post that’s all about Dynamics 365 F&O, but the main focus is on security, specifically Microsoft Sentinel. A few days ago, it was announced that a Microsoft Sentinel Solution for F&O had been published, and it’s currently in preview. Let’s learn a bit about it!

Sometimes we overlook the security aspects of the things that are not directly related to F&O, especially resources like networking, storage accounts, dev VMs, Microsoft Entra ID (this is Azure AD’s new name!) or using Bastion.

And it’s not that we just don’t care about security; it’s that we’re Dynamics 365 people, and sometimes we might lack the knowledge in other areas. If you’re lucky, you’ll have an in-house security team that’ll take care of that; otherwise, we need to train ourselves a bit.

This is the first time I’ve used Microsoft Sentinel, and I’m surely missing a lot of things and features. Time to learn!

If you’re integrating Dynamics 365 Finance & Operations with third parties, and your organization or the third party uses a firewall, you might’ve found yourself being asked “what is the production/sandbox IP address?”.

Well, we don’t know. We know which IP it has now, but we don’t know whether it will have the same IP in the future, so you’ll have to monitor this if you plan on allowing single IPs. This is something Dag Calafell wrote about on his blog: Static IP not guaranteed for Dynamics 365 for Finance and Operations.

So, what should I do if I have a firewall and need to allow access to/from Dynamics 365 F&O or any other Azure service? The network team usually doesn’t like the answer: if you can’t allow a FQDN, you should open all the address ranges for the datacenter and service you want to access. And that’s a lot of addresses, which makes the network team sad.

In today’s post, I’ll show you a way to keep an eye on the ranges provided by Microsoft, and hopefully make our lives easier.

WARNING: prompted by this LinkedIn comment, I want to point out that the ranges you can find using this method are for INBOUND communication into Dynamics 365 or whatever other service. For outbound communication, check this on Learn: For my Microsoft-managed environments, I have external components that have dependencies on an explicit outbound IP safe list. How can I ensure my service is not impacted after the move to self-service deployment?
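To give an idea of what working with those ranges can look like (a sketch of one possible approach, not necessarily the exact method from the post): Microsoft publishes the weekly “Azure IP Ranges and Service Tags – Public Cloud” JSON file, and each service tag in it carries its address prefixes and a change number you can compare between downloads. The file and tag names below are placeholders.

```python
import json

# Hypothetical local copy of the weekly "Azure IP Ranges and Service Tags -
# Public Cloud" JSON file from the Microsoft Download Center.
with open("azure-ip-ranges.json", encoding="utf-8") as f:
    data = json.load(f)

# Placeholder tag: pick the service/region tag that matches your scenario.
wanted_tag = "AzureCloud.westeurope"

for service_tag in data["values"]:
    if service_tag["name"] == wanted_tag:
        props = service_tag["properties"]
        # changeNumber increases whenever the prefixes of this tag change,
        # so comparing it week over week tells you when to update firewall rules.
        print(f"{wanted_tag} (changeNumber {props['changeNumber']}):")
        for prefix in props["addressPrefixes"]:
            print(prefix)
```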

We’ve been working with the F&O development VMs for a long time, especially Microsoft partners, who need to be able to quickly and easily switch between different customer environments; using the VHD is a bit more complicated in that scenario.

And of course, we use the Remote Desktop Protocol to connect to these VMs. RDP is insecure due to its weak encryption, widespread use, and lack of built-in security features, so hackers often target it to gain unauthorized access to systems. You can learn more about securing your VMs in Best practices for defending Azure Virtual Machines.

Today, we will walk through the steps of configuring Azure Bastion for Dynamics 365 Finance and Operations development VMs.
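The post walks through the actual configuration steps, but as a rough idea of what the deployment amounts to, here’s a minimal sketch using the azure-mgmt-network Python SDK (my own assumption, with placeholder resource names; it’s not necessarily how the post does it). Bastion needs a dedicated subnet named exactly AzureBastionSubnet and a Standard SKU public IP, and then the Bastion host itself:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient
from azure.mgmt.network.models import (
    BastionHost, BastionHostIPConfiguration, SubResource,
)

# Placeholder identifiers: replace with your own subscription and resources.
SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "rg-d365fo-dev"
LOCATION = "westeurope"

client = NetworkManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Bastion requires a subnet named exactly "AzureBastionSubnet" (at least /26)
# in the VNet where the dev VMs live, plus a Standard SKU public IP address.
subnet_id = (
    f"/subscriptions/{SUBSCRIPTION_ID}/resourceGroups/{RESOURCE_GROUP}"
    "/providers/Microsoft.Network/virtualNetworks/vnet-d365fo-dev"
    "/subnets/AzureBastionSubnet"
)
public_ip_id = (
    f"/subscriptions/{SUBSCRIPTION_ID}/resourceGroups/{RESOURCE_GROUP}"
    "/providers/Microsoft.Network/publicIPAddresses/pip-bastion"
)

poller = client.bastion_hosts.begin_create_or_update(
    RESOURCE_GROUP,
    "bastion-d365fo-dev",
    BastionHost(
        location=LOCATION,
        ip_configurations=[
            BastionHostIPConfiguration(
                name="bastion-ipconfig",
                subnet=SubResource(id=subnet_id),
                public_ip_address=SubResource(id=public_ip_id),
            )
        ],
    ),
)
print(poller.result().provisioning_state)
```

Once the Bastion host is provisioned, you connect to the VM from the Azure portal without exposing port 3389 to the internet.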

In the previous post, we learned about my proposed architecture for Azure API Management for integrations. And I closed the post with a pending item: the details of how to deploy and configure all the Azure resources the solution uses.

And in today’s post we’re going to see how all the resources are created, linked together and configured to get a fully working API logging solution, and how we can deploy everything using Bicep.
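All the deployment details in the post are done with Bicep; purely as an illustration of how such a deployment can be kicked off programmatically (my own sketch, not the post’s approach), you can compile the template with bicep build and push the resulting ARM JSON with the azure-mgmt-resource Python SDK. The file, resource group, deployment and parameter names below are placeholders.

```python
import json

from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient
from azure.mgmt.resource.resources.models import (
    Deployment, DeploymentMode, DeploymentProperties,
)

# ARM JSON produced by "bicep build main.bicep" (hypothetical file name).
with open("main.json", encoding="utf-8") as f:
    template = json.load(f)

client = ResourceManagementClient(DefaultAzureCredential(), "<subscription-id>")

poller = client.deployments.begin_create_or_update(
    "rg-apim-logging",            # placeholder resource group
    "apim-logging-deployment",    # placeholder deployment name
    Deployment(
        properties=DeploymentProperties(
            mode=DeploymentMode.INCREMENTAL,
            template=template,
            # Placeholder parameter: whatever the Bicep template expects.
            parameters={"apimName": {"value": "apim-d365-integrations"}},
        )
    ),
)
print(poller.result().properties.provisioning_state)
```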

I’m back with additional information about Azure API Management! More Azure content, and I’ll probably continue to produce posts regarding Azure in the future.

I believe there are numerous ways to learn new things, and for me, two of them are writing blog posts and using new technologies to solve problems at work. Of course, my goal is to attempt to apply the Azure themes I write about to Dynamics 365.

Today, I’m presenting an architecture approach for integrations, leveraging API Management and various other Azure components, for Dynamics 365 or anything else that has an endpoint.

You can read the second part of this post: IaC with Bicep: deploy Azure API Management Architecture.

Let’s talk about logs in Dynamics 365 Finance and Operations. And I don’t mean the built-in database log we’ve had since the old Axapta days. I’m talking about plain logs: a table and a form to see how/why data is changing, or logging external calls to OData or custom web service endpoints in the ERP.

It’s something that I’m sure almost all developers have had to do at some point. The customer wants to record CRUD events, you suggest enabling the DB log, but the customer wants a new form to see the data. Or maybe you’re monitoring all the calls to your custom web services.

You can read my complete ALM guide on Microsoft Dynamics 365 for Finance & Operations and Azure DevOps.

Moving data from the production to a sandbox environment is something we regularly have to do to get real, up-to-date data for testing or debugging. It’s a process that takes time and that can be automated, as I explained in the post LCS DB API: automating Prod to Dev DB copies.

In this post, I’ll add an extra step to the database refresh: restoring a data package (DP). Why? Because I’m sure we all need to change some parameters or some endpoints in our test environments after a prod refresh.

You can learn more about the DMF REST API, which I’ll use, by reading this post from Fabio Filardi: Dynamics 365 FinOps: Batch import automation with Azure Functions, Business Events and PowerBI.
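To give a feel for that extra step, here’s a minimal sketch of calling the DMF package API’s ImportFromPackage action from Python. The environment URL, app registration, data project, legal entity and package URL are all placeholders, and the surrounding flow (uploading the package to a writable blob URL, monitoring the execution) is not shown here.

```python
import requests
from azure.identity import ClientSecretCredential

# Placeholder values: your F&O environment URL and an Azure AD app registration
# that has been added in the environment's Azure Active Directory applications form.
FNO_URL = "https://yourenvironment.sandbox.operations.dynamics.com"
credential = ClientSecretCredential(
    tenant_id="<tenant-id>",
    client_id="<client-id>",
    client_secret="<client-secret>",
)
token = credential.get_token(f"{FNO_URL}/.default").token

# ImportFromPackage is a DMF OData action; packageUrl must point to a blob
# containing the data package ZIP to import.
body = {
    "packageUrl": "<blob-url-of-the-data-package>",
    "definitionGroupId": "MyDataProject",   # placeholder data project name
    "executionId": "",                      # empty: let F&O generate one
    "execute": True,
    "overwrite": True,
    "legalEntityId": "USMF",                # placeholder legal entity
}

response = requests.post(
    f"{FNO_URL}/data/DataManagementDefinitionGroups"
    "/Microsoft.Dynamics.DataEntities.ImportFromPackage",
    headers={"Authorization": f"Bearer {token}"},
    json=body,
    timeout=60,
)
response.raise_for_status()
print("Execution ID:", response.json()["value"])
```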

You can learn more about the LCS DB REST API by reading these posts I wrote some time ago. You might want to read them because I’m skipping some steps which are already explained there:
