Priority-based throttling for Dynamics 365 integrations

We’re finally getting throttling functionality for OData integrations!

Integrating Dynamics 365 with other systems is one of the most common customer requirements. With the (back in the day) new AX7 we got a new way of doing it using the OData REST endpoint and the exposed Data Entities.

But integrations using the OData endpoints have low performance, and for high-volume integrations it’s better to use the Data management package REST API. Even (not so) high-volume usage of the OData REST API will translate into performance issues.

The throttling functionality is in preview starting with version 10.0.13, which is currently in PEAP, and it will be enforced starting April 2021. You can join the Data Management, Data Entities, OData and Integrations Yammer group for more info. Remember you need to join the Insider Program for Dynamics 365 first to be able to access the Yammer group.

If you want to learn more about OData and throttling you can check these resources:


Why are we getting this functionality? Throttling is a well-known way of controlling the execution of processes, and it’s present in almost all of the REST APIs out there. But until now it wasn’t for us.

The problem with OData is that when you make more calls than the endpoints can process, system performance is impacted, because the OData endpoints run on the same resources as the rest of the system. And this is true for interactive sessions (real people using MSDyn365FO) and non-interactive sessions alike.

And here’s where throttling will help! With throttling we’ll be able to set priorities for different integrations to keep the responsiveness of the system.

Configure priorities

To configure the priorities we need to go to System administration – Setup – Throttling priority mapping where we’ll see this:

Throttling authentication types

If we open the Authentication type field we see there are two different types: AAD application and User. The AAD type applies to integrations that access FnO using an AAD client ID + secret (OData, custom web services), while the User type applies only to specific users. For example, if some user needs to periodically import data using the Excel add-in, we can set a high priority for that single user:

Throttling configuration

In the image above we can see there’s an AAD integration with Low priority and two users with Medium and High priority. If the system gets to the point where it needs to throttle integrations, the user JULIA will have the highest priority to keep using OData, then the user JOHN and finally the AAD integration.

This means the AAD integration will be throttled first, then JOHN and finally JULIA, because she’s responsible for some business-critical processes.

Throttled requests

Until now we had no way of knowing that a request was being throttled; you’d just get a timeout or a 503 error with no more details. With this new feature, a throttled request can be answered with error code 429 Too Many Requests:

This will finally shed some light on these errors, and we’ll be able to tell the other side to try again after some time. Instead of just retrying again, and again, and again, because we got no specific error, making the problem bigger by sending even more requests, we can change the logic of the integration and retry in X seconds/minutes.
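For instance, the client-side retry logic could look something like this minimal Python sketch (`call_with_retry` and the `send` callable are made-up names for illustration; honouring the `Retry-After` header is the standard companion of a 429 response, and the exponential fallback is just one possible default):

```python
import time

def call_with_retry(send, max_retries=5):
    """Call send() and retry while the service answers 429 Too Many Requests."""
    for attempt in range(max_retries):
        response = send()
        if response.status_code != 429:
            return response
        # The service tells us how long to back off; fall back to an
        # exponential delay if the Retry-After header is missing.
        delay = int(response.headers.get("Retry-After", 2 ** attempt))
        time.sleep(delay)
    raise RuntimeError("request was throttled %d times in a row" % max_retries)
```

Any HTTP client works here; the only requirement is that `send()` returns an object exposing `status_code` and `headers`, like a `requests` response.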

Monitoring throttling in LCS

Remember you can monitor throttling from the LCS environment page:

Throttling monitoring from LCS

What is throttling not?

Please, let me steal one of the slides from the tech talk, because we need to remember something:

What is throttling not?

I want to stress the second point: throttling is not a solution to bad entity design or code. Throttling is a tool to prevent an issue and to notify integrations that their requests are being throttled, but it won’t fix performance issues born in the design or development phases.

As I said at the beginning of the post, if you still haven’t, go and watch the Throttling Overview for Fin/Ops Integrations tech talk.

Self-service & SSRS: print reports as PDF in your Dev VM

If you’re working with the (not so) new self-service Tier 2 environments in Dynamics 365 for Finance and Operations you might have already noticed this: reports in Tier 2+ and production environments don’t use the SSRS report viewer; instead they’re displayed in a beautiful PDF preview form.

But what happens on your development box?

If you want to know more about self-service environments you can read these posts I wrote a while back:

Report viewer on your Dev VM

If you try to print a report from your dev VM/OneBox, you will still see the old HTML SSRS report viewer instead of the self-service preview PDF viewer. This is totally expected and well documented.

But why should I care about this?

To me there’s one reason, and only one: the SSRS report is rendered differently in the HTML viewer and in the PDF preview, sometimes REALLY differently. And this can be a problem.

Imagine this: it’s Monday, a new week starts, you’re fresh and full of energy, and you’re the lucky one in the team that got assigned a new report! “Wow, that’s a nice way to start the week” says NOBODY.

Me when I’m assigned an SSRS report

Sorry, I had to let that out. So you spend many hours changing the report, previewing it, checking that all the 40 damn fields of the line won’t cause a page break, and done. You check it in, and the next day somebody tells you that, on UAT, the data is being printed on 2 separate pages. Whyyyy?

Well, that’s because the SSRS and PDF viewers don’t render the report the same way. It should be pretty much the same but in the end it’s not.

Fixing your dev box

As I said earlier this is well documented now, but when we got into the private self-service preview it wasn’t. Luckily my colleague Ferni Tudela found that the SrsReportRunUtil class had a method that checked whether the current environment was self-service:

A workaround was to extend this method to always return true, and your dev box would become a fake Service Fabric VM.
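Something along these lines, as a Chain of Command extension (the method name here is illustrative; check the actual self-service check inside SrsReportRunUtil before copying this):

```
[ExtensionOf(classStr(SrsReportRunUtil))]
final class SrsReportRunUtil_DevBox_Extension
{
    // Illustrative method name: look for the self-service check in SrsReportRunUtil
    public static boolean isSelfServiceEnvironment()
    {
        next isSelfServiceEnvironment();
        return true; // pretend the dev box is a self-service environment
    }
}
```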

But now there’s a better way to do this, simply navigate to your environment URL and append &mi=SysReportAdministration to it. This will open the report options form where you need to enable the first checkbox:

Report options

Restart IIS and you’re ready to start using the new PDF preview viewer. I hope this will save some time if you’ve been assigned a report.

Add a Menu Item to a SysOperation dialog

A short one! Some time ago I explained how to add a multi selection lookup to a SysOperation dialog and in this post I’ll explain how to add a Menu Item as a Function button to the SysOperation dialog.

Before the SysOperation Framework was introduced in AX2012, we used the RunBase Framework, and maybe these things looked easier/quicker with RunBase because all the logic lived in a single class. But in the end what we need to do is practically the same, we just have to do it in the UIBuilder class.

Let me show you and explain all the code. I’ll only show the DataContract and UIBuilder classes as they’re the only important ones in this case.

DataContract class

Please don’t look at all the hardcoding 😛

On the DataContract class I’ve defined a VendAccount member. I’ve also created a group using the SysOperationGroup attribute and put the VendAccount field inside using the SysOperationGroupMember attribute in the parm method.

The UIBuilder class is also set in the SysOperationContractProcessing attribute.
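A sketch of what such a contract can look like (all class, group and member names here are made up for the example):

```
[DataContract,
    SysOperationGroup('VendorGroup', 'Vendor', '1'),
    SysOperationContractProcessing(classStr(TutorialVendUIBuilder))]
class TutorialVendContract
{
    VendAccount vendAccount;

    // The group member attribute places this parm field inside the group
    [DataMember,
        SysOperationGroupMember('VendorGroup')]
    public VendAccount parmVendAccount(VendAccount _vendAccount = vendAccount)
    {
        vendAccount = _vendAccount;
        return vendAccount;
    }
}
```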

UIBuilder class

Everything we need to do to modify a SysOperation Framework dialog must be done in the UIBuilder class. And what we need to do is the same as we would have done with the RunBase framework: get the controls from the dialog and modify them or add new elements.

We get the vendor group we created in the data contract class and add a control to it. This control is of type MenuFunctionButton, and once it’s created we set its properties and we’re done.
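A sketch of that UIBuilder (again with made-up names; the group name must match the one declared in the contract):

```
class TutorialVendUIBuilder extends SysOperationAutomaticUIBuilder
{
    public void postBuild()
    {
        FormBuildGroupControl groupControl;
        FormBuildFunctionButtonControl buttonControl;

        super();

        // Get the group created from the contract's SysOperationGroup attribute
        groupControl = this.dialog().formBuildDesign().control('VendorGroup');

        // Add a MenuFunctionButton to the group and set its properties
        buttonControl = groupControl.addControl(FormControlType::MenuFunctionButton, 'VendTableButton');
        buttonControl.menuItemType(MenuItemType::Display);
        buttonControl.menuItemName(menuItemDisplayStr(VendTable));
        buttonControl.text('Open vendors');
    }
}
```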

A Menu Item button in a SysOperation dialog!

Azure functions & Dynamics 365 Finance and Operations

This is another post about solving Dynamics 365 problems using external tools. However, I’m starting to think of everything Azure-related as not external at all. In this case I’ll show different scenarios using Azure Functions with Dynamics 365.

I wrote this almost three weeks ago and it was intended as a two-part post, but after seeing this wonderful blog post about Azure Functions from Fabio Filardi I don’t know what else I could add, so I’ll probably leave it here. Go check it out!

In this post we’ll see what Azure Functions are and how to create one locally and deploy it to Azure.

Azure functions

Azure Functions is a serverless compute service that allows us to run code without deploying it to any server (because it’s serverless). It’s a wonderful tool that lets us write code in .NET, JavaScript or Java, deploy it, and have it running in seconds.

The functions can be triggered by different events like HTTP, queues, Azure Event Hub and Grid or Service Bus, amongst others. These events will make the function run our code. Let me show you how to quickly create, deploy and run an Azure Function triggered by a POST HTTP call.

Note: Azure Functions can also be created from the Azure portal, but using Visual Studio is more powerful (and easier in my opinion) and will allow us to add the code to Azure DevOps.

Create a new Visual Studio project and select Azure Functions:

Azure Functions in Visual Studio 2019

Give it a name and in the triggers select the Http trigger:

Azure Functions triggers

The solution will open with some sample code. We can run this code locally just by pressing F5:

Azure Functions running locally

You can see that after running the code I get a local URL I can query to trigger it. We can do this using Postman.

A quick trick!

But what if we needed to test this function from an external service that can’t make an HTTP request to our localhost? Let me show a little trick Juanan showed me: use ngrok.

ngrok will create a tunnel and give you a public URL to your machine to call the function. Download it, unzip it and put ngrok.exe where you want it to run. Then open a command prompt and run this:
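The command is a one-liner (7071 being the port the Functions runtime listens on by default):

```shell
ngrok http 7071
```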

Where 7071 should be the same port your function is running on. You’ll get this:

Azure Functions and ngrok, the perfect couple

Now you could call your local function from anywhere using the public URL ngrok has generated. You’ll see all the calls made in the prompt. ngrok is just great! I love it!

Calling the function

First, let’s check the code:
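If you created the function from the HTTP trigger template, the generated sample looks roughly like this (reproduced from the stock Visual Studio template, so details may vary between Functions SDK versions):

```csharp
[FunctionName("Function1")]
public static async Task<IActionResult> Run(
    [HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] HttpRequest req,
    ILogger log)
{
    log.LogInformation("C# HTTP trigger function processed a request.");

    // Read the name from the query string...
    string name = req.Query["name"];

    // ...or from the JSON body if it wasn't passed in the URL
    string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
    dynamic data = JsonConvert.DeserializeObject(requestBody);
    name = name ?? data?.name;

    return name != null
        ? (ActionResult)new OkObjectResult($"Hello, {name}")
        : new BadRequestObjectResult("Please pass a name on the query string or in the request body");
}
```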

This function accepts GET and POST calls, and if you pass a value with the ‘name’ query parameter in the URL you will get a different response than if you pass nothing.

I’ll call the local URL passing a parameter and see what we get:

Azure Functions response

Regarding code, we can do exactly the same things we’d do in any other .NET project: add all sorts of libraries using NuGet, create additional classes and use them in the function, etc. Azure Functions can help us solve plenty of problems, and the cost is ridiculous! The free tier includes 1 million executions, FREE! The only cost you’d incur would be the storage account that’s created, but we’re talking about cents per month.

Deploy the function

I’m right-clicking publishing!

To deploy the function we need to create it in the Azure portal first. Once that’s done, go to Visual Studio, right-click the project and publish the function.

I think this is the only time we can right click, publish without anybody wanting to kill us.

Now select a Consumption plan and “Select existing”:

Create the profile and you’ll be asked to select an Azure subscription and resource group. Select the subscription where you’ve created the Function App and its resource group:

Azure Functions publish

And finally click the Publish button and your function will be deployed to Azure.

Publish the function!

Go to the Azure portal, select the function you created before and go to “Functions”:

There you’ll see your function, select it and in the new window click on “Get Function Url”:

Copy the function URL and call it from Postman, or a browser as it also accepts GET requests, and pass it the name parameter too:

It works! We’ve created the Azure Function in a local solution in Visual Studio, tested it locally and finally deployed it to Azure and tested it again, now in the cloud. Aren’t Azure Functions cool? They can also be secured using AAD or Azure Key Vault, monitored with Application Insights, and much more.

One final remark: when we deploy an Azure Function like we’ve done here, we cannot edit the code in Azure. We need to make the changes in Visual Studio and deploy it again.

Creating (more) community, or trying

I’m sorry for my English-speaking readers because, maybe, this post will be a bit useless for you as all the content I’ll talk about is in Spanish. But it’s always good to know!

In the last few days I’ve taken part in a community event, the 365 Saturday online, and I’ve also started a podcast. I want to talk a bit about this.

Dynamics Power Spain Online 2020

This has been my fourth participation as a speaker in the last three years and as usual I’ve presented a session with Juanan. This time we’ve talked about using Azure DevOps with Microsoft Dynamics 365 for Finance and Operations.

It’s a topic I write about a lot, but we really think there are still many people using it the wrong way, or just using the source control part. And that’s bad!

You can watch our session below; as I said before, it’s only in Spanish and I think there are no YouTube subtitles.

There’s more sessions from other Axazure colleagues, again in Spanish:

Many Axazure sessions, as you can see. That’s what happens when you promote sharing with the community! You can see the other sessions on 365 Saturday Madrid‘s YouTube channel, and some are in English!

And now… the podcast!

Yes! Juan Antonio Tomás and myself have started a podcast in 2020! Why? Each time there’s a new feature for MSDyn365FO, an announcement, a preview, whatever, we spend some time talking about it and what we could do with it. So we thought, “Why don’t we record this?”. And now we have a podcast you can listen to, even though it’s also in Spanish…

You can listen to the first episode here!

Visit us at!

There’s another reason to do this: the Finance and Operations technical community in Spain. It makes me terribly jealous to see the strength of the Dynamics 365 CE/Power Platform, .NET, Azure and other technical communities. We don’t have this for AX and that’s what we want!

We would like to have a bigger technical community, and this is how we’ll try to encourage other people: by sharing what’s coming for FnO. We of course accept collaborations, and if anybody wants to be interviewed or participate we’re totally open!


The idea of community we have is something really simple, and I think it’s something we’ve learnt from Antonio and the essence of El Rincón Dynamics: a free and collaborative place where anybody can learn, share and connect. It’s very easy, right?

There might even be people that don’t fully understand why we share what we know freely instead of keeping it all to ourselves, because sharing our secrets supposedly decreases our personal value. And that way of thinking is so, so, so wrong! I share what I know, but I also have over 10 years of experience behind me; the mix of those two things is where my value lies.

I can only see positive things in sharing.

How do we do branching in Dynamics 365?

I’ve written this post after Mötz Jensen asked me to, in a long and really interesting Twitter discussion on branching and version control in Dynamics 365 for Finance and Operations. This is Denis Trunin‘s tweet that started it all:

Just go and read all the replies; it’s some of the most interesting content I’ve read on Twitter in months.

Branching in MSDyn365FO

You can also check my MSDyn365FO & Azure DevOps ALM guide!


When deciding which branching strategy to use, you have to think about what will suit your team, and follow the KISS principle: keep your branching strategy as simple as you can! You don’t want to spend most of your day managing code instead of writing it, right? Or trying to fix bad merges…

Also keep in mind that your strategy might differ depending on whether you’re working on a customer implementation project or developing an ISV solution.

Main – Release

This is one of the simplest strategies. You start your work on the Main branch, with all developers working on it, and you keep working only with Main until go-live.

When the project is about to go live, branch Main and create a Release branch. This new branch will be the one you use to promote code to the production environment. Development of new features and bug fixes continues on the Main branch.

When a development is done or a bug is fixed, we merge the changeset (or changesets) to the Release branch. Yes, we’ll be doing cherry-picking. I know it has a bad reputation, but we depend on the customer validating the changes…
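In TFVC a cherry-pick is just a merge restricted to a specific changeset range; from a command prompt it looks something like this (paths and changeset numbers are made up for the example):

```shell
tf merge /recursive /version:C1234~C1234 $/MyProject/Main $/MyProject/Release
tf checkin /comment:"Promote changeset 1234 to Release"
```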

On our projects we have at least two Tier 2+ environments. We use one for testing and validation, and deploy to it the deployable package (DP) created from the Main branch. The other Tier 2+ environment is the one we use for user testing and for promoting to production; this second environment is updated with the DP from the Release branch.

Dev – Main – Release

This is something we’ve been doing lately, trying to emulate Git branches for development. We’re using a Dev branch for every developer. We work on our Dev branch, do all the check-ins we want during a development and, when it’s done, merge all the changesets of the branch into a single changeset in Main. Finally we forward-integrate Main into our Dev branch to get the changes from other developers.

Yes, it does involve a bit more merging on the Dev – Main part. But the idea behind this strategy is having a clean list of single changesets per development in our Main branch. Why? Because… cherry-picking…

We work with Dev and Main until go-live, when we branch Main and create the Release branch. The Tier 2+ environments are serviced in the same manner as in the Main – Release strategy.

As I said, the idea is to have a clean list of changesets to move developments from Main to Release and to solve all merge conflicts in the Dev branches. Each developer is responsible for their branch and for resolving conflicts there.

We’ve been working with this strategy for some months and the results are OK: we’re not having issues with too much management overhead. In the future we’ll try Git, but Juanan will explain that part!

General advice

First of all: train yourself and your team. Remember, using a VCS is mandatory; this is part of your job now. Find somebody that can help, even if they’re outside the AX world. The problems of software development are more or less the same regardless of the language we use.

Don’t keep changesets pending to be merged forever. The number of merge conflicts that appear is directly proportional to the time a changeset has been waiting to be merged.

Remember to Keep it simple, stupid (or Keep it stupid simple), KISS. Don’t blindly follow what some guy on the Internet tells you, because his projects might have different needs than yours.

So this is how we do branching at Axazure. Are there better ways of doing it? Sure! Can this be improved? I have no doubts about it. But this works for us, which is the important thing.

Is the CDS the future of Finance and Operations apps?

Adrià the seer: what will happen with the CDS?

After the MBAS on Wednesday I’m thinking about this more and more: will Dynamics 365 for Finance and Supply Chain Management’s data be natively hosted in the CDS?

After watching Ryan Jones’ session “What’s new in the Common Data Service“, I ask myself whether that’s the right question, or whether it should be: when will it be natively available in the Common Data Service?

The Common Data Service

The CDS is a platform that allows us to store data that will be used by the business applications. But it’s not only that, take a look at this picture:

CDS (screenshot from Ryan Jones session on MBAS)

We could put MSDyn365FO on top of all that: it supports relational data, storage, reporting, workflows, security, etc. Of course that wouldn’t be an overnight switch, but maybe something progressive, like what we’ll get with the FnO virtual entities on CDS!

With virtual entities we still won’t have Finance and SCM data on CDS because virtual entities:

Virtual entities enable the integration of data residing in external systems by seamlessly representing that data as entities in Common Data Service, without replication of data and often without custom coding.

“Without replication of data”. When you access a virtual entity in the Common Data Service its state is dynamically retrieved from the external system.

The CDS capabilities
CDS + Operations (screenshot from Ryan Jones session on MBAS)

As you can see in the image all public data entities will be natively in CDS. This means we can use the Power Platform capabilities for Finance and Operations as fast and easy as our Customer Engagement colleagues do. At least for the public data entities.

If we need data to be physically in both places we’ll still need to use Dual Write. Remember that Dual Write synchronizes data between Finance and Operations and Customer Engagement/CDS in near real time.

CDS + Operations: Under the Hood (screenshot from Ryan Jones session on MBAS)

If you want to learn a bit more about Dual Write you can check the “And finally… Dual Write!” session Juan Antonio and I did at the 2019 Dynamics 365 Saturday Madrid. It’s in Spanish and a bit dated, Dual Write now has many more out-of-the-box functionalities, but it gives an idea of what it does and what it’s capable of.

Will this ever happen?

Who knows? I’m just speculating, I’m a developer, but I can’t stop thinking that Microsoft is investing a lot in CDS. And the Finance and Operations apps are the only Dynamics 365 products whose data does not reside in the Common Data Service.

We’re seeing some functionalities from FnO being replicated and later extended in the CDS, like Dynamics 365 Human Resources or Dynamics 365 Project Operations. This is creating an issue, because right now you must build an integration between the two applications if you want to have some kind of data exchange. FnO in the Common Data Service would solve this.

This also creates confusion for customers who think this integration happens out of the box, when it doesn’t. The product naming suggests it, but it’s not the case.

I think this won’t happen in the next year, or two, or three; it’s something for the long term. I don’t know about CDS apps, but Dynamics 365 for Finance and SCM has quite a large number of tables, and migrating all of them to the Common Data Service is surely a tremendous amount of work.

And what about the developer tools? Those would have to change too! We’ll see where the product, and we as professionals, are headed, but we certainly can’t think about Finance and Operations without the CDS anymore.

Azure hosted build for Dynamics 365 Finance & SCM

Behold #XppGroupies! The day we’ve been waiting for has come! The Azure-hosted builds are in public preview with PU35!! We can now stop asking Joris when this will be available, because it already is! Check the docs!

I’ve been able to write this because, thanks to Antonio Gilabert, we’ve been testing this at Axazure for a few months with access to the private preview. And of course thanks to Joris for inviting us to the preview!

Riding the Azure Pipelines by Caza Pelusas

What does this mean? We no longer need a VM to run the build pipelines! Well, nah, we still do: if you’re running tests or synchronizing the DB as part of your build pipeline you still need the VM. But we can move CI builds to the Azure-hosted agent!

You can also read my full guide on MSDyn365FO & Azure DevOps ALM.

Remember this is a public preview. If you want to join the preview you first need to be part of the Dynamics 365 Insider Program where you can join the “Dynamics 365 for Finance and Operations Insider Community“. Once invited you should see a new LCS project called PEAP Assets, and inside its Asset Library you’ll find the nugets in the Nuget package section.

Continue reading “Azure hosted build for Dynamics 365 Finance & SCM”

LCS DB API: automating Prod to Dev DB copies

The new LCS DB API endpoint to create a database export has been published! With it we now have a way of automating and scheduling a database refresh from your Dynamics 365 FnO production environment to a developer or Tier 1 VM.

Using the LCS DB API

You can learn more about the LCS DB REST API reading these posts I wrote some time ago. You might want to read them because I’m skipping some steps which are already explained there:

You can also read the full guide on MSDyn365FO & Azure DevOps ALM.

And remember: this is currently in private preview. If you want to join the preview you first need to be part of the Dynamics 365 Insider Program where you can join the “Dynamics 365 for Finance and Operations Insider Community“. Once invited to the Yammer organization you can ask to join the “Self-Service Database Movement / DataALM” group where you’ll get the information to add yourself to the preview and enable it on LCS.

Continue reading “LCS DB API: automating Prod to Dev DB copies”

Compiler warnings: are you checking them in Dynamics 365?

Compiler warnings. Warnings. They’re not errors, only warnings. You can just overlook and forget them, right? Well, I hope you don’t.

“But even the standard code written by Microsoft throws warnings!”, you could say. And that’s true, but that’s not your code, it’s Microsoft’s. If a functionality you’re using breaks because they didn’t care about their warnings, you can open a support request and it’s Microsoft’s job to fix it. If your code breaks some functionality because you didn’t care about a warning, it’s your job to fix it, and your customer will want it fixed as fast as you’d want Microsoft to fix theirs.

That’s why we should be warned about warnings (badum tss).

Continue reading “Compiler warnings: are you checking them in Dynamics 365?”