Set up Entity Store’s export to Azure Data Lake storage

This post is easy to start, because many people will ask:

What’s a Data Lake?

Fishing in a Data Lake. By cazapelusas.

A Data Lake is not an Azure product but a term referring to a place where data is stored, regardless of whether it’s structured or unstructured. Its only purpose is storing the data ready to be consumed by other systems. It’s like a lake that stores the water of its tributaries, but with data instead of water.

In Azure, the Data Lake is a Blob storage account that holds the data. And this data can come from Microsoft Dynamics 365 for Finance or Supply Chain Management (I’ll go crazy with the name changes of Axapta 7) or from other sources.

Currently, and since PU23, #MSDyn365FO (#MSDyn365F? or #MSDyn365SCM?) officially supports exporting the Entity Store to Azure Data Lake Storage Gen1, but compatibility with Data Lake Storage Gen2 is in the works in a private program with Data Feeds that will allow us to export entities and tables (YES!) in near real time. If you want to know more, check the Data Management, Data Entities, OData and Integrations Yammer group in the Insider Program (if you still haven’t joined, you should).

Comparison with BYOD

The first thing to notice is the price. Storage is cheaper than a database, even if it’s a single SaaS DB on Azure SQL. For example, a 1GB Blob storage account on Azure costs $21.6/month.

And the simplest Azure SQL database, a Gen4 with 1 vCore, costs $190.36/month. Almost 10 times more.

And what about performance? This comes from observation, not a real performance test, but data is transferred really fast. And it’s fast because in a Data Lake the data is sent raw; there’s no transformation until it’s consumed (ETL for a DB vs. ELT for a Data Lake), so less time is spent before the data reaches its destination. This doesn’t have a real impact for small data sets, but it does for large ones.

Setup

The process to export the Entity Store to a Data Lake is pretty simple and it’s well documented (though not fully up to date) in the docs. I’ll explain it step by step.

Create a storage account on Azure

In Azure, go to Storage accounts (or search for it in the top bar) and add a new one with a setup like the one in the pictures below:

Make sure to disable Gen2 storage:

Then you can go to Review + create. When the account is ready, go to Access keys and copy the connection string:
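If you want to double-check that the connection string you copied is valid before going on, a minimal C# sketch like this one can list the containers in the new account (using the Azure.Storage.Blobs package is my choice here, not part of the official steps):

```csharp
using System;
using Azure.Storage.Blobs;

class CheckStorageAccount
{
    static void Main()
    {
        // The connection string copied from the storage account's Access keys blade.
        // Read from an environment variable so it doesn't end up in source control.
        string connectionString = Environment.GetEnvironmentVariable("DATALAKE_CONNECTION_STRING");

        // If the string is valid, this connects and lists the containers in the account.
        var serviceClient = new BlobServiceClient(connectionString);
        foreach (var container in serviceClient.GetBlobContainers())
        {
            Console.WriteLine(container.Name);
        }
    }
}
```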

Azure Key Vault

The next step is creating a Key Vault. For this step you need to select the same region as your Dynamics 365 instance:

When the Key Vault is ready, go to the resource and create a new secret. Paste the connection string from the storage account into the value and press Create:
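If you prefer doing this bit from code instead of the portal, here’s a minimal sketch with the Azure.Security.KeyVault.Secrets package (the vault URL and secret name are placeholders, and DefaultAzureCredential is just my assumption for local authentication):

```csharp
using System;
using Azure.Identity;
using Azure.Security.KeyVault.Secrets;

class CreateKeyVaultSecret
{
    static void Main()
    {
        // Placeholder vault URL: use your own Key Vault's DNS name.
        var client = new SecretClient(
            new Uri("https://my-d365-keyvault.vault.azure.net/"),
            new DefaultAzureCredential());

        // The secret's value is the storage account connection string copied before.
        string connectionString = Environment.GetEnvironmentVariable("DATALAKE_CONNECTION_STRING");
        client.SetSecret("d365-datalake-connection", connectionString);
    }
}
```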

Create an AAD App Registration

Now we’ll create an AAD App. Give it a name, select the supported account types you need and fill in the URL with the base URL of your #MSDyn365FO instance:

Click Register, and now we must add the Azure Key Vault API to the app, as in the image below:

Select the API and add the delegated user_impersonation permission:

Don’t forget to press the button you can see above to grant privileges (this must be done by an Azure admin). Now go to Secrets and create a new one: give it a name and copy the secret value. Once you close the tab you won’t be able to recover that secret anymore, so copy it and save it somewhere until we need it.

Set up the Key Vault

Go back to the Key Vault we created in the second step and go to Access policies. Add a new one:

You have to select Get and List for Key and Secret permissions:

Now press Select principal and add the AAD App created in the third step:

Add it, and don’t forget to press Save in the Access policies screen!!
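To picture why the Get and List permissions and the principal matter: with this setup, MSDyn365FO can authenticate as the AAD App and read the secret from the Key Vault, presumably something along these lines (a minimal C# sketch with the Azure.Identity and Azure.Security.KeyVault.Secrets packages; every name and value is a placeholder):

```csharp
using System;
using Azure.Identity;
using Azure.Security.KeyVault.Secrets;

class ReadSecretAsAadApp
{
    static void Main()
    {
        // Placeholders: the tenant ID, the AAD App (client) ID and the app secret we saved earlier.
        var credential = new ClientSecretCredential(
            "<tenant-id>",
            "<aad-app-client-id>",
            "<aad-app-secret-value>");

        // The app can only do this because the access policy grants it Get/List on secrets.
        var client = new SecretClient(
            new Uri("https://my-d365-keyvault.vault.azure.net/"),
            credential);

        KeyVaultSecret secret = client.GetSecret("d365-datalake-connection");
        Console.WriteLine($"Got the storage connection string ({secret.Value.Length} characters).");
    }
}
```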

Set up MSDyn365F… and O, or SCM, or whatever its name is this month

Navigate to System administration -> Setup -> System parameters and go to the Data connections tab. There are four fields for the Key Vault. The Application ID field corresponds to the Application ID of the AAD App (pretty obvious) and the Application Secret is the secret from the AAD App. This part is easy and clear.

The DNS name field is the URL of your Key Vault, and the Secret name field is the name of the Key Vault secret where you pasted the storage account connection string.
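Just as an illustration (every value below is a placeholder), the four fields end up looking something like this:

```
Application ID     : 00000000-0000-0000-0000-000000000000        (the AAD App's client ID)
Application Secret : <the secret value copied from the AAD App>
DNS name           : https://my-d365-keyvault.vault.azure.net/
Secret name        : d365-datalake-connection
```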

Once all these fields are complete you can press Test Azure Key Vault and Test Azure Storage and, if you followed all steps correctly, you should see the following messages:

If any of the validations fail, I’d just delete all the resources and start from scratch; it’s probably a secret mismatch.
Now, the two buttons you see next to the setup fields:
  • Enable Data Lake integration: enables the full push of the Entity Store to the storage account you’ve just created, which is the main purpose of this post.
  • Trickle update Data Lake: pushes updates after data is changed (Trickle Feed).

Set up the Entity Store

Now we just need to go to the Entity Store (under System administration -> Setup -> Entity Store) and enable the refresh of the entities we’d like to hydrate the Data Lake with (I love this word; it seems to be the correct technical term for feeding a Data Lake):

And done, our data is now being pushed to an Azure Blob:

Each entity is saved in its own folder, and inside each folder there’s another folder for each measure of that entity and a CSV file with the data in it.

Now this can be consumed in Power BI with the Blob storage connector, feed Azure Data Factory, or whatever else you can think of, because that’s the purpose of the Data Lake.
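For instance, a quick way to pull those CSV files out of the Blob storage for any downstream process could be something like this sketch (the container name and the measure folder are hypothetical; check how the export actually laid out your storage account):

```csharp
using System;
using System.IO;
using Azure.Storage.Blobs;

class DownloadEntityStoreCsv
{
    static void Main()
    {
        string connectionString = Environment.GetEnvironmentVariable("DATALAKE_CONNECTION_STRING");
        var serviceClient = new BlobServiceClient(connectionString);

        // Hypothetical container name: use the one the Entity Store export created in your account.
        var container = serviceClient.GetBlobContainerClient("entitystore");

        // List everything under one measure's folder (also a made-up name) and download the CSV files.
        foreach (var blob in container.GetBlobs(prefix: "RetailSales/"))
        {
            if (!blob.Name.EndsWith(".csv", StringComparison.OrdinalIgnoreCase))
            {
                continue;
            }

            string localPath = Path.Combine("export", blob.Name.Replace('/', Path.DirectorySeparatorChar));
            Directory.CreateDirectory(Path.GetDirectoryName(localPath));
            container.GetBlobClient(blob.Name).DownloadTo(localPath);
            Console.WriteLine("Downloaded " + blob.Name);
        }
    }
}
```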

 

Set up the new Azure DevOps tasks for Packaging and Model Versioning

During this past night (at least it was night for me :P) the new Azure DevOps tasks to create the deployable packages and update model versions have been published:

There’s an announcement on the Community blogs too, with extended details on setting them up. Let’s see the new tasks and how to configure them.

Update Model Version task

This one is the easiest: just add it to your build definition under the current model versioning task, disable the original one, and you’re done. If you have any filters in your current task, like excluding a model, you must add the filter in the Descriptor Search Pattern field using Azure DevOps pattern syntax, as in the example below.
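I haven’t checked the exact default value of that field, but assuming it accepts the usual Azure DevOps file-matching patterns, excluding a model (MyIsvModel is a made-up name) would look something like this:

```
**/Descriptor/*.xml
!**/MyIsvModel/Descriptor/*.xml
```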

Create Deployable Package task

This task will replace the Generate packages step from the current build definitions. To set it up we just need to make a couple of changes to the default values:

X++ Tools Path

This is the physical bin folder on your build VM; the AosService folder is usually on drive K for cloud-hosted VMs. I guess this will change when we go VM-less to do the builds.

Update!: the drive letter can be replaced with $(ServiceDrive), resulting in a path like $(ServiceDrive)\AOSService\PackagesLocalDirectory\bin.

Location of the X++ binaries to package

The task comes with this field filled in as $(Build.BinariesDirectory), but that didn’t work for our build definitions; maybe the variable isn’t set up in the proj file. After changing it to $(Agent.BuildDirectory)\Bin the package is generated.

Filename and path for the deployable package

The path in the image should be changed to $(Build.ArtifactStagingDirectory)\Packages\AXDeployableRuntime_$(Build.BuildNumber).zip. You can leave the Packages folder out of the path, but if you do, you will need to change the Path to Publish field in the Publish Artifact: Package step of the definition.

Add Licenses to Deployable Package task

This task will add the license files to an existing Deployable Package. Remember that the path of the deployable package must be the same as the one in the Create Deployable Package task.

And you’re done! A step closer to getting rid of the build VM.

If you need help setting up the release pipeline you can check this post I wrote.

Using Azure Application Insights with MSDyn365FO

First of all… DISCLAIMER: think twice before using this in a production environment. Then think again. And if you finally decide to use it, do it in the most cautious and lightweight way.

Why does this deserve a disclaimer? Well, even though the docs state that system performance should not be impacted, I don’t really know its true impact. Plus it’s an ERP, one where we don’t have access to the production environment (unless you’re on-premises) to verify that there’s no performance degradation. And Microsoft is probably already using it to collect data from the environments to show in LCS, and I don’t know whether this could interfere with that. A lot of I-don’t-knows.

Would I use it on production? YES. It will be really helpful in some cases.

With that said, what am I going to write about that needs a disclaimer? As the title says, about using Azure Application Insights in Microsoft Dynamics 365 for Finance and Operations. This post is just one of the “Have you seen that? Yeah, we should try it!” consequences of Juanan (he’s on Twitter, follow him!) and me talking. And the “that” this time was this post from Lane Swenka on AX Developer Connection. So nothing original here 🙂

Azure Application Insights

I spy!! Made by Cazapelusas.

What’s Application Insights? As the documentation says:

Application Insights is an extensible Application Performance Management (APM) service for web developers on multiple platforms. Use it to monitor your blah web application. It will blah blah detect blaaah anomalies. It blah powerful blahblah tools to bleh blah blih and blah blah blaaaah. It’s blaaaaaaaah.

Mmmm… you better watch this video:

So much misery and sadness in the first 30 seconds…

Monitoring. That’s what it does and what it’s for. “LCS already does that!“ OK, extra monitoring! Everybody loves extra, like on pizzas, unless it’s pineapple, of course.

Getting it to work

The first step will be to create an Application Insights resource on our Azure subscription. Regarding pricing: the first 5GB per month are free and data will be retained for 90 days. More details here.

Then we need the code. I’ll skip the details in this part because it’s perfectly detailed in the link above (this one), just follow the steps. You basically need to create a DLL library to handle the events and send data to AAI and use it from MSDyn365FO. In our version we’ve additionally added the trackTrace method to the C# library. Then just add a reference to the DLL in your MSDyn365FO Visual Studio project and it’s ready to use.
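Just to picture what that DLL looks like, here’s a minimal sketch of such a wrapper using the classic Microsoft.ApplicationInsights SDK and an instrumentation key (class and method names are mine; the real library is in the linked post):

```csharp
using System.Collections.Generic;
using Microsoft.ApplicationInsights;
using Microsoft.ApplicationInsights.Extensibility;

namespace D365FO.AppInsights
{
    // Minimal telemetry wrapper to be referenced from X++.
    // Names and structure are illustrative, not the exact library from the linked post.
    public class TelemetryLogger
    {
        private readonly TelemetryClient client;

        public TelemetryLogger(string instrumentationKey)
        {
            var config = TelemetryConfiguration.CreateDefault();
            config.InstrumentationKey = instrumentationKey;
            client = new TelemetryClient(config);
        }

        // Custom event with optional properties (field name, old/new value, AOS, batch flag...).
        public void TrackEvent(string eventName, IDictionary<string, string> properties = null)
        {
            client.TrackEvent(eventName, properties);
            client.Flush();
        }

        // The trackTrace-style method we added to our version of the library.
        public void TrackTrace(string message)
        {
            client.TrackTrace(message);
            client.Flush();
        }
    }
}
```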

What can we measure?

And now the interesting part (I hope). We can measure page views, errors (or all infologs), batch executions, field value changes, and anything else we can extend to call our API methods from.

For example, we can extend the FormDataUtil class from the forms engine. This class has several methods that are called from forms for different actions on the data sources, like validating writes, deletes, field validations, etc… And also this:

modifiedField in FormDataUtil

This will run after a form field’s value is modified. We’ll extend it to log which field has had its value changed, along with the old and new values. Like this:

Extending modifiedField
I promise I always use labels!

And because the Application Insights call will also store the user that triggered the value change, we’ve just got a new database log! Even better, we’ve got a new database log with no performance consequences, because there’s no extra data to be generated on MSDyn365FO’s side. The only drawback is that it will only be called from forms, but it might be enough to monitor the usage of forms and counter the “no, I haven’t changed any parameter” 🙂

This is what we get on Azure’s Application Insights metrics explorer:

Azure Application Insights Custom Event
What do you mean I changed that?!

Yes you did, Admin! Ooops it’s me…

Custom events

We’re also storing the AOS name and whether the call originated in a batch.
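In case it helps to picture it, a call logging a field change could look roughly like this (the wrapper class is the hypothetical one sketched earlier and the property names are mine; use whatever dimensions you want to filter by later):

```csharp
using System.Collections.Generic;
using D365FO.AppInsights; // the hypothetical wrapper sketched earlier

class Example
{
    static void Main()
    {
        var logger = new TelemetryLogger("<instrumentation-key>");

        // One custom event per field change, with the context we want to query later.
        logger.TrackEvent("FieldValueChanged", new Dictionary<string, string>
        {
            { "DataSource", "CustTable" },
            { "Field", "CreditMax" },
            { "OldValue", "0.00" },
            { "NewValue", "100000.00" },
            { "User", "Admin" },
            { "AosName", "AOS01" },
            { "IsBatch", "false" }
        });
    }
}
```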

All the metrics from our events will show up in Azure, and the data can later be displayed in Power BI, if you feel like it.

With this example you can go on and add calls to the extended objects where you need it. Batches, integrations, critical processes, etc…

Again, please plan what you want to monitor before using this, and test it. Then test it again, especially in SAT environments with Azure SQL databases, which perform a bit differently from the regular SQL Server ones.

And enjoy the data!