Azure functions & Dynamics 365 Finance and Operations

This is another post about solving Dynamics 365 problems using external tools. However, I’m starting to think of everything Azure-related as not external at all. In this case I’ll show different scenarios using Azure Functions with Dynamics 365.

I wrote this almost three weeks ago and it was intended as a two-part post, but after seeing this wonderful blog post about Azure Functions from Fabio Filardi I don’t know what else I could add, so I’ll probably leave it here. Go check it out!

In this post we’ll see what Azure Functions are and how to create one locally and deploy it to Azure.

Azure functions

Azure Functions is a serverless compute service that allows us to run code without deploying it to any server (because it’s serverless). It’s a wonderful tool that lets us write code in .NET, JavaScript or Java, deploy it and run it in seconds.

The functions can be triggered by different events like HTTP requests, queues, Azure Event Hubs, Event Grid or Service Bus, amongst others. These events make the function run our code. Let me show you how to quickly create, deploy and run an Azure Function triggered by a POST HTTP call.

Note: Azure Functions can also be created from the Azure portal, but using Visual Studio is more powerful (and easier, in my opinion) and allows us to add the code to Azure DevOps.

Create a new Visual Studio project and select Azure Functions:

Azure Functions in Visual Studio 2019

Give it a name and in the triggers select the Http trigger:

Azure Functions triggers

The solution will open with some sample code. We can run this code locally just by pressing F5:

Azure Functions running locally

You can see that after running the code I get a local URL I can query to trigger it. We can do this using Postman.

A quick trick!

But what if we needed to test this function from an external service that can’t make an HTTP request to our localhost? Let me show you a little trick Juanan showed me: use ngrok.

ngrok will create a tunnel and give you a public URL to your machine to call the function. Download it, unzip it and put ngrok.exe where you want it to run. Then open a command prompt and run this:
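A minimal invocation looks like this (assuming you run it from the folder where you put ngrok.exe):

```shell
ngrok http 7071
```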

Where 7071 should be the same port your function is running on. You’ll get this:

Azure Functions and ngrok, the perfect couple

Now you could call your local function from anywhere using the public URL ngrok has generated. You’ll see all the calls made in the prompt. ngrok is just great! I love it!

Calling the function

First, let’s check the code:
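For reference, the HTTP trigger sample that Visual Studio generates looks roughly like this (reproduced from memory of the v3 in-process template, so details may vary by version):

```csharp
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;

public static class Function1
{
    [FunctionName("Function1")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] HttpRequest req,
        ILogger log)
    {
        log.LogInformation("C# HTTP trigger function processed a request.");

        // Read the 'name' parameter from the query string...
        string name = req.Query["name"];

        // ...or from the JSON body of a POST request
        string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
        dynamic data = JsonConvert.DeserializeObject(requestBody);
        name = name ?? data?.name;

        return name != null
            ? (ActionResult)new OkObjectResult($"Hello, {name}")
            : new BadRequestObjectResult("Please pass a name on the query string or in the request body");
    }
}
```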

This function accepts GET and POST calls, and if you pass a value with the ‘name’ query parameter in the URL you will get a different response than if you pass nothing.

I’ll call the local URL passing a parameter and see what we get:
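With curl, for example, the call would look like this (assuming the default port and the template’s function name, Function1):

```shell
curl "http://localhost:7071/api/Function1?name=Dynamics"
```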

Azure functions response

Regarding code, we can do exactly the same things we’d do in any other .NET project: add all sorts of libraries using NuGet, create additional classes and use them in the function, etc. Azure Functions can help us solve plenty of problems, and the cost is ridiculous! The free tier includes 1 million executions a month, FREE! The only cost you’d incur would be the storage account that’s created, but we’re talking about cents a month.

Deploy the function

I’m right-clicking publishing!

To deploy the function we need to create it in the Azure portal first. Once that’s done, go back to Visual Studio, right-click the project and select Publish.

I think this is the only time we can right-click, publish without anybody wanting to kill us.

Now select a Consumption plan and “Select existing”:


Create the profile and you’ll be asked to select an Azure subscription and resource group. Select the subscription where you’ve created the Function App and its resource group:

Azure functions publish

And finally click the Publish button and your function will be deployed to Azure.

Publish the function!

Go to the Azure portal, select the Function App you created before and go to “Functions”:


There you’ll see your function, select it and in the new window click on “Get Function Url”:


Copy the function URL and call it from Postman, or from a browser since it also accepts GET requests, and pass it the name parameter too:


It works! We’ve created the Azure Function in a local solution in Visual Studio, tested it locally and finally deployed it to Azure and tested it again, now in the cloud. Aren’t Azure Functions cool? They can also be secured using AAD or Azure Key Vault, monitored using Application Insights, and many other options.

One final remark: when we deploy an Azure Function this way we cannot edit the code in Azure. We need to make the changes in Visual Studio and deploy it again.

Secure your Azure Pipelines with Azure Key Vault

You can read my complete guide on Microsoft Dynamics 365 for Finance & Operations and Azure DevOps.

After the update of my last post about calling the LCS API from Azure DevOps Pipelines I thought that creating a pipeline with a password in plain sight was not very secure. How could we add extra security to a pipeline? Once again we can turn to an Azure tool to help us, the Azure Key Vault.

Azure Key Vault

A Key Vault is a service that allows us to safely store certificates or secrets and later use them in our applications and services. And like many other Azure services it has a cost, but it’s really low and, for normal use, you’ll be billed a cent a month, or nothing at all. Don’t be stingy with security!

Continue reading “Secure your Azure Pipelines with Azure Key Vault”

Calling the LCS Database Movement API in your Azure DevOps pipeline

You can read my complete guide on Microsoft Dynamics 365 for Finance & Operations and Azure DevOps.

I talked about the LCS Database Movement API in a post not long ago, and in this one I’ll show how to call the API using PowerShell from your Azure DevOps Pipelines.

What for?

Basically, automation. Right now the API only allows the refresh from one Microsoft Dynamics 365 for Finance and Operations environment to another, so the idea is having fresh data from production in our UAT environments daily. I don’t know which new operations the API will support in the future but another idea could be adding the DB export operation (creating a bacpac) to the pipeline and having a copy of prod ready to be restored in a Dev environment.

Continue reading “Calling the LCS Database Movement API in your Azure DevOps pipeline”

Set up the new Azure DevOps tasks for Packaging and Model Versioning

You can read my complete guide on Microsoft Dynamics 365 for Finance & Operations and Azure DevOps.

During this past night (at least it was night for me :P) the new tasks for Azure DevOps to deploy the packages and update model versions have been published:

There’s an announcement in the Community blogs too with extended details on setting them up. Let’s see the new tasks and how to set them up.

Continue reading “Set up the new Azure DevOps tasks for Packaging and Model Versioning”

Using Azure Application Insights with MSDyn365FO

First of all… DISCLAIMER: think twice before using this on a productive environment. Then think again. And if you finally decide to use it, do it in the most cautious and light way.

Why does this deserve a disclaimer? Well, even though the docs state that system performance shouldn’t be impacted, I don’t really know its true impact. Plus it’s an ERP, one where we don’t have access to the production environment (unless you’re On-Prem) to verify that there’s no performance degradation. And Microsoft is probably already using it to collect data from the environments to show in LCS, and I don’t know whether it could interfere with that. A lot of I-don’t-knows.

Would I use it on production? YES. It will be really helpful in some cases.

With that said, what am I going to write about that needs a disclaimer? As the title says, about using Azure Application Insights in Microsoft Dynamics 365 for Finance and Operations. This post is just one of the “Have you seen that? Yeah, we should try it!” consequences of Juanan (and on Twitter, follow him!) and me talking. And the “that” this time was this post from Lane Swenka on AX Developer Connection. So nothing original here 🙂

Azure Application Insights

I spy!! Made by Cazapelusas

What’s Application Insights? As the documentation says:

Application Insights is an extensible Application Performance Management (APM) service for web developers on multiple platforms. Use it to monitor your blah web application. It will blah blah detect blaaah anomalies. It blah powerful blahblah tools to bleh blah blih and blah blah blaaaah. It’s blaaaaaaaah.

Mmmm… you better watch this video:

So much misery and sadness in the first 30 seconds…

Monitoring. That’s what it does and what it’s for. “LCS already does that!“. OK, extra monitoring! Everybody loves extra, like on pizzas, unless it’s pineapple, of course.

Getting it to work

The first step will be to create an Application Insights resource on our Azure subscription. Regarding pricing: the first 5GB per month are free and data will be retained for 90 days. More details here.

Then we need the code. I’ll skip the details in this part because it’s perfectly detailed in the link above (this one), just follow the steps. You basically need to create a DLL library to handle the events and send data to AAI and use it from MSDyn365FO. In our version we’ve additionally added the trackTrace method to the C# library. Then just add a reference to the DLL in your MSDyn365FO Visual Studio project and it’s ready to use.
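The core of such a library is little more than a wrapper around TelemetryClient from the Microsoft.ApplicationInsights NuGet package. A minimal sketch (the class name and the instrumentation-key handling here are my own, see the linked post for the real thing):

```csharp
using System.Collections.Generic;
using Microsoft.ApplicationInsights;
using Microsoft.ApplicationInsights.Extensibility;

public class AppInsightsLogger
{
    private readonly TelemetryClient client;

    public AppInsightsLogger(string instrumentationKey)
    {
        var config = new TelemetryConfiguration(instrumentationKey);
        client = new TelemetryClient(config);
    }

    // Send a custom event with optional properties (user, AOS, old/new value...)
    public void TrackEvent(string eventName, Dictionary<string, string> properties = null)
    {
        client.TrackEvent(eventName, properties);
        client.Flush(); // don't lose data if the process recycles
    }

    // The trackTrace method we added in our version
    public void TrackTrace(string message)
    {
        client.TrackTrace(message);
        client.Flush();
    }
}
```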

What can we measure?

And now the interesting part (I hope). Page views, errors (or all infologs), batch executions, field value changes, and anything else you can extend to call the API methods.

For example, we can extend the FormDataUtil class from the forms engine. This class has several methods that are called from forms for different actions on the data sources, like validating writes and deletes, field validations, etc… And also this:

modifiedField in FormDataUtils

This will run after a form field’s value is modified. We’ll extend it to log which field has had its value changed, plus the old and new values. Like this:

Extending modifiedField
I promise I always use labels!
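In code, the extension could look something like this sketch. The exact signature of modifiedField and the AppInsightsLogger helper are assumptions on my part, so check the actual class in your version:

```xpp
[ExtensionOf(classStr(FormDataUtil))]
final class FormDataUtilAppInsights_Extension
{
    public static void modifiedField(FormDataSource _dataSource, FieldId _fieldId)
    {
        next modifiedField(_dataSource, _fieldId);

        Common record = _dataSource.cursor();

        // AppInsightsLogger is our hypothetical wrapper around the C# library
        AppInsightsLogger::trackFieldChange(
            tableId2Name(record.TableId),               // table the field belongs to
            fieldId2Name(record.TableId, _fieldId),     // field being modified
            record.orig().(_fieldId),                   // old value
            record.(_fieldId));                         // new value
    }
}
```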

And because the Application Insights call will also store the user that triggered the value change, we just got a new database log! Even better, we got a new database log that has no performance consequences, because there’s no extra data to be generated on MSDyn365FO’s side. The only drawback is that it will only be called from forms, but it might be enough to monitor the usage of forms and counter the “no, I haven’t changed any parameter” 🙂

This is what we get on Azure’s Application Insights metrics explorer:

Azure Application Insights Custom Event
What do you mean I changed that?!

Yes you did, Admin! Ooops it’s me…

Custom events

We’re storing the AOS name too, and whether the call originated in a batch.

All the metrics from our events will show up in Azure, and the data can later be visualized in Power BI, if you feel like doing it.

With this example you can go on and add calls to the extended objects where you need it. Batches, integrations, critical processes, etc…

Again, please plan what you want to monitor before using this, and test it. Then test it again, especially on SAT environments with Azure SQL databases, which perform a bit differently than regular SQL Server ones.

And enjoy the data!