

There’s a new version and a new task for our release pipelines that use the Azure-hosted agents. These changes were introduced recently to support the new MSAL authentication libraries for the LCS service connection used to upload and deploy the deployable packages.

The current service connections use Azure Active Directory (Azure AD) Authentication Library (ADAL), and support for ADAL will end in June 2022.

This means that if we don’t update the Asset Upload and Asset Deployment tasks to their new versions (1.* and 2.*, respectively), the release pipelines could stop working after 30th June 2022.

Automating the update like…

Now that Microsoft also updates the Dynamics 365 Finance and Operations sandbox environments, partners and customers will only need to take care of updating cloud-hosted environments, as we’ve always done.

I’m sure each team manages this differently: maybe leaving it to each developer to update their VM, or there’s someone on the customer or partner side that will do it. And that’s in the best case; maybe nobody is updating the developer machines at all…

If you want to know more about builds, releases, and the Dev ALM of Dynamics 365 you can read my full guide on MSDyn365 & Azure DevOps ALM.

Today, I’m bringing you a PowerShell script that you can run in a pipeline that will automatically update all your developer virtual machines!

Need to get the price of an item that has a sales or purchase agreement? The PriceDisc class is here to save us!

PriceDisc: getting prices the right way
Trying to catch the best price using the PriceDisc framework

This is one of those reference posts that I’m writing for the Adrià of the future, because it’s something I forget about a lot.

PriceDisc magic!

There’s an obsolete method to get the price, I think it was findItemPriceAgreement. So the easiest way to get a price now is to use the PriceDisc class that replaces it.

To use it, just instantiate a PriceDiscParameters object and call all the parm methods you need. Finally, create a PriceDisc object using the newFromPriceDiscParameters method and passing the PriceDiscParameters, and… well take a look at the code below:
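A minimal sketch of how that looks; the method names follow the standard PriceDisc API, but treat the variable names and the exact parm calls as an illustration rather than the exact code from the original post:

```xpp
// Sketch: get a sales price through the PriceDisc framework.
// itemId, inventDim, unitId, currencyCode, custAccount and qty are
// assumed to be already populated from your context.
PriceDiscParameters parameters = PriceDiscParameters::construct();
PriceDisc           priceDisc;
Price               price;

parameters.parmModuleType(ModuleInventPurchSales::Sales);
parameters.parmItemId(itemId);
parameters.parmInventDim(inventDim);
parameters.parmUnitID(unitId);
parameters.parmPriceDiscDate(systemDateGet());
parameters.parmCurrencyCode(currencyCode);
parameters.parmAccountNum(custAccount);
parameters.parmQty(qty);

priceDisc = PriceDisc::newFromPriceDiscParameters(parameters);

// findPriceAgreement also takes trade agreements into account
if (priceDisc.findPriceAgreement())
{
    price = priceDisc.price();
}
```

Use ModuleInventPurchSales::Purch instead of Sales if you need the purchase price for a vendor account.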

And that’s all. I know it’s a stupid post, but so am I, and I forget these kinds of things.

It’s been some time since I wrote “Is Dataverse the future of Finance and Operations apps?“, and when I did, Dataverse was still called CDS and had already gone through several name changes.

Has anything changed since I wrote that post? Do I still see Dataverse as the future of Finance and Operations apps? Well, now we know some things for sure, and new functionalities have been rolled out.

A man with a crystal ball seeing the future of Dataverse
What’s in Dataverse…

Let me look at my crystal ball again and see what we can see inside…


We can think about convergence as the process that is bringing Dynamics 365 Finance and Operations and the Power Platform closer, making our lives easier when we want to integrate both.

If you want to learn more about FnO and Power Platform convergence, take a look at this:

And also these sessions on YouTube:

Keep an eye on the ANZD365 FinOpsTeam YouTube channel as they’re currently doing a series on convergence and more content will be streamed during the coming weekends.

Linking the ERP and Dataverse is easier than ever

I remember when I first set up Dual Write for an environment during the preview. Everything was manual and there was some trial and error involved in the process.

Have you deployed a Finance and Operations environment lately? There’s a section for Dataverse, and after marking a checkbox, it will automatically deploy a Dataverse environment that will be linked to the FnO one you’re deploying. Once it’s running, setting up Dual Write is peanuts!

And what about Virtual Entities? You can use them in your Power Platform solutions instead of using the Finance and Operations connector. This will also make Dataverse developers’ lives easier, as they will be able to access ERP data in a way they’re used to and know well.

Add-ins in Dataverse

All the add-ins that we can install to extend Finance and Operations are now installed in Dataverse. Need to install the Inventory Visibility or Export to Azure Data Lake add-ins? You’ll need to link a Dataverse environment first!

Will AxDB ever be in Dataverse?

I don’t think we’ll see that in the short or mid term, but maybe in a distant future… or maybe not. If you watch Sunil Garg’s session, he makes it very clear that, right now, having the ERP and CRM/Dataverse databases in the same DB is not on the roadmap of the convergence. But at least we’re all living on the same elastic pool-type Azure SQL servers!

And given the transactional nature of the ERP, I think it’s unlikely this will happen. But who knows, because the products are evolving at such speed that maybe, one day, we’ll see it.

X++ developers

If you’re an X++ developer, you might be asking yourself: “With all this convergence stuff, should I start learning about Power Platform?“.

Of course, you should! But not because X++ will disappear; Sunil Garg makes this crystal clear in his Pakistan UG session “Finance Operation & Power Platform convergence roadmap” (minute 9:25).

But X++ developers need to stop thinking they’re ONLY X++ developers. Juanan and I have said it in several of our sessions: we’re not just X++ developers anymore. First the move to Azure, and now the arrival of the Power Platform; we need to think of all of that as tools to complement our X++ customizations. And not only complement them, in some cases even replace them, like using Logic Apps to connect to an FTP server.

You can read my complete ALM guide on Microsoft Dynamics 365 for Finance & Operations and Azure DevOps.

Moving data from the production to a sandbox environment is something we regularly have to do to have real, up-to-date data for testing or debugging. It’s a process that takes time and that can be automated, as I explained in the post LCS DB API: automating Prod to Dev DB copies.

Me pressing the auto button to automagically import a data package

In this post, I’ll add an extra step to the database refresh: restoring a data package (DP). Why? Because I’m sure we all need to change some parameters or endpoints in our test environments after a prod refresh.

You can learn more about the DMF REST API, which I’ll use, reading this post from Fabio Filardi: Dynamics 365 FinOps: Batch import automation with Azure Functions, Business Events and PowerBI.

You can learn more about the LCS DB REST API by reading these posts I wrote some time ago. You might want to read them because I’m skipping some steps which are already explained there:

In a past post, we learned how to create custom data entities to be used in Dual-write.

And now you might be asking yourself, how do I move the Dual-write table mappings to a test or production environment from the development environment? Do I need to repeat everything I’ve done on the dev machine in a Sandbox environment?

Fortunately, we don’t need to do it all manually again, we can use a Dataverse solution to copy the Dual-write table mappings between environments.

The Dataverse worm

If you want to learn more about Dual-write you can:

It’s been a while since I first wrote about the Application Checker in 2019, and here I am again. In this blog post, I’ll talk about SocrateX and XQuery too, and I’ll also show how to generate the files and databases used to analyze the code.

Fake Socrates
This is a fakeX SocrateX

If you want to know more about App Checker or SocrateX, you can read these resources in addition to the post I’ve linked above: