Azure functions & Dynamics 365 Finance and Operations

This is another post about solving Dynamics 365 problems using external tools. However, I'm starting to think of everything Azure-related as not external at all. In this case I'll show different scenarios using Azure Functions with Dynamics 365.

I wrote this almost three weeks ago and it was intended as a two-part post, but after seeing this wonderful blog post about Azure Functions from Fabio Filardi I don't know what else I could add to it, so I'll probably leave it here. Go check it out!

In this post we’ll see what Azure Functions are and how to create one locally and deploy it to Azure.

Azure functions

Azure Functions is a serverless compute service that lets us run code without deploying it to any server (because it's serverless). It's a wonderful tool that allows us to write code in .NET, JavaScript or Java, deploy it and run it in seconds.

The functions can be triggered by different events like HTTP, queues, Azure Event Hubs, Event Grid or Service Bus, amongst others. These events make the function run our code. Let me show you how to quickly create, deploy and run an Azure Function triggered by a POST HTTP call.

Note: Azure Functions can also be created from the Azure portal, but using Visual Studio is more powerful (and easier, in my opinion) and will allow us to add the code to Azure DevOps.

Create a new Visual Studio project and select Azure Functions:

Azure Functions in Visual Studio 2019

Give it a name and in the triggers select the Http trigger:

Azure Functions triggers

The solution will open with some sample code. We can run this code locally just by pressing F5:

Azure Functions running locally

You can see that after running the code I get a local URL I can query to trigger it. We can do this using Postman.

A quick trick!

But what if we need to test this function from an external service that can't make an HTTP request to our localhost? Let me show you a little trick Juanan taught me: use ngrok.

ngrok will create a tunnel and give you a public URL to your machine to call the function. Download it, unzip it and put ngrok.exe where you want it to run. Then open a command prompt and run this:
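The command itself (shown as an image in the original post) is presumably just this, using the local Functions port:

```shell
# Expose local port 7071 (the default Azure Functions local port)
# through a public ngrok URL
ngrok http 7071
```

ngrok then prints a forwarding address like `https://<random-id>.ngrok.io` that tunnels to your machine.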

Where 7071 should be the same port your function is running on. You’ll get this:

Azure Functions and ngrok, the perfect couple

Now you could call your local function from anywhere using the public URL ngrok has generated. You’ll see all the calls made in the prompt. ngrok is just great! I love it!

Calling the function

First, let’s check the code:

This function accepts GET and POST calls, and if you pass a value in the 'name' query parameter of the URL you'll get a different response than if you pass nothing.
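Assuming the default Visual Studio template names (the function name `HttpTrigger1` and the local port are assumptions, check your console output), the two cases can be tried with curl:

```shell
# With the name parameter: the template returns a personalized greeting
curl "http://localhost:7071/api/HttpTrigger1?name=Adria"

# Without it: the template asks you to pass a name
curl "http://localhost:7071/api/HttpTrigger1"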

I’ll call the local URL passing a parameter and see what we get:

Azure functions response

Regarding code, we can do exactly the same things we'd do in any other .NET project: we can add all sorts of libraries using NuGet, create additional classes and use them in the function, etc. Azure Functions can help us solve plenty of problems, and the cost is ridiculous! With the free tier we have 1 million executions included, FREE! The only cost you'd incur would be the storage account that's created, but we're talking about cents per month.

Deploy the function

I'm right-clicking publishing!

To deploy the function we need to create it in the Azure portal first. Once that's done, go to Visual Studio, right-click the project and publish the function.
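If you prefer the command line to the portal, the Function App can also be created with the Azure CLI; a sketch under assumed resource names (everything below is made up for illustration):

```shell
# Create a resource group, the storage account the Function App needs,
# and a consumption-plan Function App (names are made up)
az group create --name rg-functions-demo --location westeurope
az storage account create --name stfuncdemo123 --resource-group rg-functions-demo
az functionapp create --name func-demo-123 \
  --resource-group rg-functions-demo \
  --storage-account stfuncdemo123 \
  --consumption-plan-location westeurope
```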

I think this is the only time we can right click, publish without anybody wanting to kill us.

Now select a Consumption plan and “Select existing”:


Create the profile and you’ll be asked to select an Azure subscription and resource group. Select the subscription where you’ve created the Function App and its resource group:

Azure functions publish

And finally click the Publish button and your function will be deployed to Azure.

Publish the function!

Go to the Azure portal, select the Function App you created before and go to "Functions":


There you'll see your function; select it and in the new window click on "Get Function Url":


Copy the function URL and call it from Postman, or from a browser since it also accepts GET requests, and pass it the name parameter too:
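The URL you copy includes the function key in the `code` query parameter, so the call looks the same as the local one (the app name and the key placeholder below are made up):

```shell
# Call the deployed function; <function-key> is the key embedded
# in the URL you copied from the portal
curl "https://func-demo-123.azurewebsites.net/api/HttpTrigger1?code=<function-key>&name=Adria"
```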


It works! We've created the Azure Function in a local solution in Visual Studio, tested it locally and finally deployed it to Azure and tested it again, now in the cloud. Aren't Azure Functions cool? They can also be secured using AAD or Azure Key Vault, you can monitor them using Application Insights, and there are many other options.

One final remark: when we deploy an Azure Function like we've done here, we can't edit the code in the Azure portal. We need to make the changes in Visual Studio and deploy it again.

How do we do branching in Dynamics 365?

I've written this post after Mötz Jensen asked me to, in a long and really interesting Twitter discussion on branching and version control in Dynamics 365 for Finance and Operations. This is Denis Trunin's tweet that initiated it all:

Just go and read all the replies; it's some of the most interesting content I've read on Twitter in months.

Branching in MSDyn365FO

You can check my MSDyn365FO & Azure DevOps ALM guide too!

Branching

When deciding which branching strategy to use, you have to think about what will suit your team, and follow the KISS principle. Keep your branching strategy as simple as you can! You don't want to spend most of your day managing code instead of writing it, right? Or trying to fix bad merges…

Also keep in mind that your strategy might not be the same if you're working on a customer implementation project as when developing an ISV solution.

Main – Release

This is one of the simplest strategies. You start your work on the Main branch, with all developers working on it. You keep working only with Main until go-live.

When the project goes live, branch Main and create a Release branch. This new branch will be the one you use to promote code to the production environment. Development of new features and bug fixes is done on the Main branch.

When a development is done, or a bug is fixed, we merge the changeset (or changesets) to the Release branch. Yes, we will be doing cherry-picking. I know it has a bad reputation, but we depend on the customer validating the changes…
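With TFVC, cherry-picking a single changeset from Main into Release can also be done from the command line with the `tf merge` version-range syntax; a sketch (the branch paths and changeset number are made up):

```shell
# Merge only changeset 123 from Main into Release, then check the result in
tf merge /recursive /version:C123~C123 $/MyProject/Main $/MyProject/Release
tf checkin /comment:"Promote changeset 123 to Release"
```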

On our projects we have at least two Tier 2+ environments. We use one for testing and validation, to which we deploy the deployable package (DP) created from the Main branch. The other Tier 2+ environment is the one we use for user testing and for deploying to production. This second environment is updated with the DP from the Release branch.

Dev – Main – Release

This is something we've been doing lately, trying to emulate Git branches for development. We use a Dev branch for every developer. We work on our Dev branch, making all the check-ins we want during a development, and when it's done we merge all the changesets (or the whole branch) into Main as a single changeset. Finally, we Forward Reverse Integrate Main into our Dev branch to get the changes from the other developers.

Yes, it does involve a bit more merging on the Dev – Main side. But the idea behind this strategy is having a clean list of single changesets, one per development, in our Main branch. Why? Because… cherry-picking…

We work with Dev and Main until go-live, when we branch Main and create the Release branch. The Tier 2+ environments are serviced in the same manner as with the Main – Release strategy.

As I said, the idea is to have a clean list of changesets to move developments from Main to Release, and to solve all merge conflicts in the Dev branches. Each developer is responsible for their branch and for resolving conflicts there.

We've been working with this strategy for some months and the results are OK; we're not having issues with too much management overhead. In the future we'll try Git, but Juanan will explain that part!

General advice

First of all: train yourself and your team. Remember, using a VCS is mandatory; this is part of your job now. Find somebody who can help, even if they're outside the AX world. The problems of software development are more or less the same regardless of the language we use.

Don’t keep pending changesets to be merged forever. The amount of merge conflicts that will appear is directly proportional to the time the changeset has been waiting to be merged.

Remember KISS: Keep it simple, stupid (or Keep it stupid simple). Don't blindly follow what some guy on the Internet tells you, because his projects might have different needs than yours.

So this is how we do branching at Axazure. Are there better ways of doing it? Sure! Can this be improved? I have no doubts about it. But this works for us, which is the important thing.

Azure hosted build for Dynamics 365 Finance & SCM

Behold, #XppGroupies! The day we've been waiting for has come! The Azure-hosted builds are in public preview with PU35! We can now stop asking Joris when this will be available, because it already is! Check the docs!

I’ve been able to write this because, thanks to Antonio Gilabert, we’ve been testing this at Axazure for a few months with access to the private preview. And of course thanks to Joris for inviting us to the preview!

Riding the Azure Pipelines by Caza Pelusas

What does this mean? We no longer need a VM to run the build pipelines! Nah, we still need it! If you're running tests or synchronizing the DB as part of your build pipeline, you still need the VM. But we can move CI builds to the Azure-hosted agent!
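In practice this means the CI pipeline can target a Microsoft-hosted agent instead of your build VM. A minimal sketch of what the pipeline YAML could look like (the branch name and the step are placeholders; the real setup needs the compiler NuGet packages from the preview):

```yaml
# Run CI on a Microsoft-hosted Windows agent instead of the D365 build VM
trigger:
  - main
pool:
  vmImage: windows-latest
steps:
  - script: echo "Restore the X++ compiler NuGet packages and build the modules here"
```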

You can also read my full guide on MSDyn365FO & Azure DevOps ALM.

Remember this is a public preview. If you want to join it you first need to be part of the Dynamics 365 Insider Program, where you can join the "Dynamics 365 for Finance and Operations Insider Community". Once invited, you should see a new LCS project called PEAP Assets, and inside its Asset Library you'll find the NuGet packages in the NuGet package section.


LCS DB API: automating Prod to Dev DB copies

The new LCS DB API endpoint to create a database export has been published! With it we now have a way of automating and scheduling a database refresh from your Dynamics 365 FnO production environment to a developer or Tier 1 VM.
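A sketch of what such an automated call could look like with curl. The exact URL shape, the project and environment IDs and the backup name below are assumptions for illustration, and you need a valid OAuth bearer token for LCS first:

```shell
# Trigger a database export (bacpac) in LCS; IDs and backup name are made up
TOKEN="<your-oauth-token>"
curl -X POST \
  -H "Authorization: Bearer $TOKEN" \
  "https://lcsapi.lcs.dynamics.com/databasemovement/v1/export/project/1234567/environment/abc-def-123/backupName/prod-backup"
```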

Using the LCS DB API

You can learn more about the LCS DB REST API by reading these posts I wrote some time ago. You might want to read them because I'm skipping some steps which are already explained there:

You can also read the full guide on MSDyn365FO & Azure DevOps ALM.

And remember: this is currently in private preview. If you want to join the preview you first need to be part of the Dynamics 365 Insider Program, where you can join the "Dynamics 365 for Finance and Operations Insider Community". Once invited to the Yammer organization you can ask to join the "Self-Service Database Movement / DataALM" group, where you'll get the information to add yourself to the preview and enable it on LCS.


Secure your Azure Pipelines with Azure Key Vault

You can read my complete guide on Microsoft Dynamics 365 for Finance & Operations and Azure DevOps.

After the update of my last post about calling the LCS API from Azure DevOps Pipelines, I thought that creating a pipeline with a password in plain sight was not very secure. How can we add extra security to a pipeline? Once again we can turn to an Azure tool for help: Azure Key Vault.

Azure Key Vault

A Key Vault is a service that allows us to safely store certificates or secrets and later use them in our applications and services. Like many other Azure services it has a cost, but it's really low; for normal use you'll be billed a cent or less a month. Don't be stingy with security!
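Creating a vault and storing a secret such as the LCS password takes a couple of CLI calls; a sketch with made-up names:

```shell
# Create a Key Vault and store the pipeline password as a secret
az keyvault create --name kv-pipelines-demo --resource-group rg-functions-demo
az keyvault secret set --vault-name kv-pipelines-demo \
  --name LcsPassword --value "<the-password>"
```

The pipeline can then pull `LcsPassword` through the Azure Key Vault task instead of hard-coding it.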


Calling the LCS Database Movement API in your Azure DevOps pipeline

You can read my complete guide on Microsoft Dynamics 365 for Finance & Operations and Azure DevOps.

I talked about the LCS Database Movement API in a post not long ago, and in this one I’ll show how to call the API using PowerShell from your Azure DevOps Pipelines.

What for?

Basically, automation. Right now the API only allows a refresh from one Microsoft Dynamics 365 for Finance and Operations environment to another, so the idea is having fresh data from production in our UAT environments daily. I don't know which new operations the API will support in the future, but another idea could be adding the DB export operation (creating a bacpac) to the pipeline and having a copy of prod ready to be restored in a dev environment.


DevOps ALM automation in Microsoft Dynamics 365 for Finance and Operations

You can read my complete guide on Microsoft Dynamics 365 for Finance & Operations and Azure DevOps.

I’ve already written some posts about development Application Lifecycle Management (ALM) for Dynamics 365 for Finance and Operations in the past:

The possibility of doing real CI/CD is one of my favorite MSDyn365FO things, going from “What’s source control?” to “Mandatory source control or die” has been a blessing. I’ll never get tired of saying this.


Set up the new Azure DevOps tasks for Packaging and Model Versioning

You can read my complete guide on Microsoft Dynamics 365 for Finance & Operations and Azure DevOps.

During this past night (at least it was night for me :P) the new tasks for Azure DevOps to deploy the packages and update model versions have been published:

There's an announcement in the Community blogs too, with extended details on setting them up. Let's see the new tasks and how to set them up.


Using Azure Application Insights with MSDyn365FO

First of all… DISCLAIMER: think twice before using this on a production environment. Then think again. And if you finally decide to use it, do it in the most cautious and lightweight way.

Why does this deserve a disclaimer? Well, even though the docs state that system performance should not be impacted, I don't really know its true impact. Plus, it's an ERP; one in which we don't have access to the production environment (unless you're on-prem) to certify that there's no performance degradation. And Microsoft is probably already using it to collect data from the environments to show in LCS, and I don't know if it could interfere with that. A lot of I-don't-knows.

Would I use it on production? YES. It will be really helpful in some cases.

With that said, what am I going to write about that needs a disclaimer? As the title says: using Azure Application Insights in Microsoft Dynamics 365 for Finance and Operations. This post is just one of the "Have you seen that? Yeah, we should try it!" consequences of Juanan (also on Twitter, follow him!) and me talking. And the "that" this time was this post from Lane Swenka on AX Developer Connection. So nothing original here 🙂

Azure Application Insights

I spy! Made by Cazapelusas

What’s Application Insights? As the documentation says:

Application Insights is an extensible Application Performance Management (APM) service for web developers on multiple platforms. Use it to monitor your blah web application. It will blah blah detect blaaah anomalies. It blah powerful blahblah tools to bleh blah blih and blah blah blaaaah. It’s blaaaaaaaah.

Mmmm… you better watch this video:

So much misery and sadness in the first 30 seconds…

Monitoring. That's what it does and what it's for. "LCS already does that!" OK, extra monitoring! Everybody loves extra, like on pizzas, unless it's pineapple, of course.

Getting it to work

The first step is to create an Application Insights resource in our Azure subscription. Regarding pricing: the first 5 GB per month are free and data is retained for 90 days. More details here.
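The resource can be created from the portal or, if you prefer, with the Azure CLI. This needs the `application-insights` CLI extension, and the names below are made up:

```shell
# Install the Application Insights CLI extension and create the resource
az extension add --name application-insights
az monitor app-insights component create \
  --app appi-msdyn365fo-demo \
  --location westeurope \
  --resource-group rg-monitoring-demo
```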

Then we need the code. I'll skip the details here because they're perfectly explained in the link above (this one); just follow the steps. You basically need to create a DLL library that handles the events and sends the data to Application Insights, and use it from MSDyn365FO. In our version we've additionally added the trackTrace method to the C# library. Then just add a reference to the DLL in your MSDyn365FO Visual Studio project and it's ready to use.

What can we measure?

And now the interesting part (I hope). Page views, captured errors (or all infologs), batch executions, field value changes, and anything else you can extend to call our API methods from.

For example, we can extend the FormDataUtil class from the forms engine. This class has several methods that are called from forms on different data source actions, like validating writes, deletes, field validations, etc. And also this one:

modifiedField in FormDataUtils

This runs after a form field's value is modified. We'll extend it to log which field had its value changed, along with the old and new values. Like this:

Extending modifiedField
I promise I always use labels!

And because the Application Insights call also stores the user that triggered the value change, we've just got a new database log! Even better, a database log with no performance consequences, because there's no extra data to be generated on MSDyn365FO's side. The only drawback is that it will only be called from forms, but it might be enough to monitor the usage of forms and counter the "no, I haven't changed any parameter" excuse 🙂

This is what we get on Azure’s Application Insights metrics explorer:

Azure Application Insights Custom Event
What do you mean I changed that?!

Yes you did, Admin! Ooops it’s me…

Custom events

We're storing the AOS name too, and whether the call originated in a batch.

All the metrics from our events are displayed in Azure, and the data can later be shown in Power BI, if you feel like doing it.

With this example you can go on and add calls to the extended objects where you need it. Batches, integrations, critical processes, etc…

Again, please plan what you want to monitor before using this, and test it. Then test it again, especially on SAT environments with Azure SQL databases, which perform a bit differently than the regular SQL Server ones.

And enjoy the data!