Author: Adrià Ariste Santacreu

If you’re integrating Dynamics 365 Finance & Operations with third parties, and either your organization or the third party uses a firewall, you might’ve found yourself being asked: “what is the production/sandbox IP address?”.

Well, we don’t know. We know which IP it has now, but we don’t know whether it will keep the same IP in the future, so you’ll have to monitor this if you plan on opening single IPs. This is something Dag Calafell wrote about on his blog: Static IP not guaranteed for Dynamics 365 for Finance and Operations.

So, what should I do if I have a firewall and need to allow access to/from Dynamics 365 F&O or any other Azure service? The network team usually doesn’t like the answer: if you can’t allow an FQDN, you should open all the address ranges for the datacenter and service you want to access. And that’s a lot of addresses, which makes the network team sad.

In today’s post, I’ll show you a way to keep an eye on the ranges provided by Microsoft, and hopefully make our lives easier.

WARNING: due to this LinkedIn comment, I want to remark that the ranges you can find using this method are for INBOUND communication into Dynamics 365 or whatever service. For outbound communication, check this on Learn: For my Microsoft-managed environments, I have external components that have dependencies on an explicit outbound IP safe list. How can I ensure my service is not impacted after the move to self-service deployment?
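
To give an idea of what “keeping an eye on the ranges” can look like, here’s a minimal sketch using the Azure SDK for Python and the Service Tag Discovery API. The subscription ID, region, and tag name are placeholders, and this is just one way of doing it; the method described in the post may rely on different tooling.

```python
# Minimal sketch: list the address prefixes behind an Azure service tag
# using the Azure SDK for Python (azure-identity + azure-mgmt-network).
# Subscription ID, region and tag name are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient

SUBSCRIPTION_ID = "<your-subscription-id>"   # placeholder
REGION = "westeurope"                        # region whose tags you want
TAG_NAME = "AzureCloud.westeurope"           # example service tag

client = NetworkManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# The discovery API returns every service tag and its current prefixes,
# so re-running this periodically lets you spot when the ranges change.
for tag in client.service_tags.list(REGION).values:
    if tag.name == TAG_NAME:
        for prefix in tag.properties.address_prefixes:
            print(prefix)
```

Diffing that output between runs (or against the weekly downloadable service tags JSON file) is essentially what monitoring the ranges boils down to.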

It’s already that time of the year again: the Dynamics 365 and Power Platform 2023 Release Wave 1 plans have just been released!

The 2023 release wave 1 for Microsoft Dynamics 365 Finance and Operations is focused on improving the development, administration, and user experiences by removing barriers, tightening integrations, and enhancing cross-platform capabilities. This release will bring a range of new features and capabilities that will help improve the performance of the platform and enhance the overall experience for developers, administrators, and end-users.

The features described here are planned to be delivered from April to September 2023.

We’ve been working with the F&O development VMs for a long time, especially Microsoft partners, who need to be able to quickly and easily switch between different customer environments; using the VHD is a bit more complicated in that scenario.

And of course, we use the Remote Desktop Protocol to connect to these VMs. RDP is insecure due to its weak encryption, widespread use, and the lack of security features built into the protocol, so hackers often target RDP to gain unauthorized access to systems. You can learn more about securing your VMs in Best practices for defending Azure Virtual Machines.

Today, we will walk through the steps of configuring Azure Bastion for Dynamics 365 Finance and Operations development VMs.
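
As a rough idea of what the end result looks like, here’s a hedged sketch that provisions a Bastion host with the Azure SDK for Python; the walkthrough itself may use the portal or CLI instead, and every name below (resource group, VNet, public IP) is a placeholder for your own resources.

```python
# Hedged sketch: create an Azure Bastion host with the Azure SDK for Python.
# Bastion requires a dedicated subnet literally named "AzureBastionSubnet"
# and a Standard-SKU public IP. All resource names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient

SUBSCRIPTION_ID = "<your-subscription-id>"
RG = "rg-d365-dev"            # placeholder resource group
LOCATION = "westeurope"

client = NetworkManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

subnet = client.subnets.get(RG, "vnet-d365-dev", "AzureBastionSubnet")
public_ip = client.public_ip_addresses.get(RG, "pip-bastion-d365")

poller = client.bastion_hosts.begin_create_or_update(
    RG,
    "bastion-d365-dev",
    {
        "location": LOCATION,
        "ip_configurations": [
            {
                "name": "bastion-ipconfig",
                "subnet": {"id": subnet.id},
                "public_ip_address": {"id": public_ip.id},
            }
        ],
    },
)
print(poller.result().provisioning_state)
```

Once the Bastion host is up, you connect to the development VM from the Azure portal over TLS on port 443, so the VM no longer needs RDP port 3389 exposed to the internet.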

In the previous post, we learned about my proposed architecture for Azure API Management for integrations. I closed that post with one pending item: the details of how to deploy and configure all the Azure resources the solution uses.

And in today’s post we’re going to see how all the resources are created, linked together and configured to have a fully working API logging solution, and how we can deploy everything using Bicep.
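
The post goes through the Bicep files themselves, but as a rough sketch of what the deployment step can look like, here’s one way to push a compiled template to a resource group with the Azure SDK for Python; it assumes you’ve already run `az bicep build --file main.bicep` to get the ARM JSON, and the file and resource group names are placeholders.

```python
# Rough sketch: deploy a compiled Bicep template (ARM JSON) to a resource
# group with the Azure SDK for Python. Names are placeholders; run
# `az bicep build --file main.bicep` first to produce main.json.
import json

from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

SUBSCRIPTION_ID = "<your-subscription-id>"
RG = "rg-apim-logging"        # placeholder resource group

client = ResourceManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

with open("main.json") as f:  # output of `az bicep build`
    template = json.load(f)

poller = client.deployments.begin_create_or_update(
    RG,
    "apim-logging-deployment",
    {
        "properties": {
            "mode": "Incremental",
            "template": template,
            "parameters": {},  # fill in the template's parameters if any
        }
    },
)
print(poller.result().properties.provisioning_state)
```

In practice, a single `az deployment group create --resource-group <rg> --template-file main.bicep` does the same thing and compiles the Bicep for you; the sketch above is just the SDK equivalent.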

I’m back with additional information about Azure API Management! More Azure content, and I’ll probably continue to produce posts regarding Azure in the future.

I believe there are numerous ways to learn new things, and for me, two of them are writing blog posts and using new technologies to solve problems at work. Of course, my goal is to apply the Azure topics I write about to Dynamics 365.

Today, I’m presenting an architecture approach for integrations, leveraging API Management and various other Azure components, for Dynamics 365 or anything else that has an endpoint.

You can read the second part of this post: IaC with Bicep: deploy Azure API Management Architecture.

We’ve seen a lot of improvements to the product since it was released as AX7, and some name changes too 😝 And one of the areas that has seen the most enhancements is the batch framework.

And starting with the latest available version of Dynamics 365 Finance and Operations, 10.0.28, we have a new feature that will be enabled by default for all new instances, and will be enabled for all existing instances in 10.0.29: priority-based batch scheduling.

Let’s talk about logs in Dynamics 365 Finance and Operations. And I don’t mean the built-in database logs we’ve had since the old Axapta days. I’m talking about plain logs: a table and a form to see how and why data is changing, or a record of external calls to OData or custom web service endpoints in the ERP.

It’s something that I’m sure almost all developers have had to do at some point. The customer wants to record CRUD events; you suggest enabling the DB log, but the customer wants a new form to see the data. Or maybe you’re monitoring all the calls to your custom web services.

So three days ago I took a look at the new 10.0.25 features currently in the PEAP program, calmly scrolling through until I saw something about custom scripts on prod, something that read:

Run custom X++ scripts with zero downtime

You can imagine my face when I read this. At first I was confused, then surprised, and then confused again. After reading the description, things didn’t get any better:

This feature lets you upload and run deployable packages that contain custom X++ scripts without having to go through Microsoft Dynamics Lifecycle Services (LCS) or suspend your system. Therefore, you can correct minor data inconsistences without causing any disruptive downtime.

Are we getting a way to run custom code in production without having to deploy it? Yes we are. As many people have said these past two days: “X++ jobs are back!”. And there are a lot of discussions going on about the custom scripts feature.