Parse XML and JSON easily in MSDyn365FO

Some time ago I had to create an interface between MSDyn365FO and a web service that returned data as XML. I decided to use X++’s XML classes (XmlDocument, XmlNodeList, XmlElement, etc…) to parse the XML and get the data. These classes are terrible. You get the job done but in an ugly way. There’s a better method to quickly parse XML or JSON in MSDyn365FO.

Continue reading “Parse XML and JSON easily in MSDyn365FO”

Feature management: create a custom feature

Feature management has been around in Microsoft Dynamics 365 for Finance and Operations for some time now. Before that, features were enabled through flighting, by running a SQL query on dev and UAT boxes (and the DSE team would do it on production).

Now we have a nice workspace showing all the available features, and flighting is still around too. The main difference between flighting and features is that flighting is enabled for a selected group of customers, like a preview of a feature.

Continue reading “Feature management: create a custom feature”

Set up the new Azure DevOps tasks for Packaging and Model Versioning

During this past night (at least it was night for me :P) the new tasks for Azure DevOps to deploy the packages and update model versions have been published:

There’s an announcement in the Community blogs too with extended details on setting it up. Let’s see the new tasks and how to set them up.

Continue reading “Set up the new Azure DevOps tasks for Packaging and Model Versioning”

Application Checker: enforcing better coding practices?

Unless you’ve been working for an ISV, there’s a high chance that you’ve never cared about Dynamics Best Practices (BP), or maybe you have. I haven’t worked for an ISV myself, but back when I started working with AX I was handed the development BP document and I’ve tried to follow most of them when writing code.

But BPs could be ignored and not implemented without any consequence. This is why Microsoft will publish…

Application Checker

Application Checker is a tool that will change that. It will enforce some rules that our code will have to meet, otherwise the code won’t compile (and maybe won’t even deploy to the environments).

We got a preview of it during the last MBAS in the session “X++ programming with quality” by Dave Froslie and Peter Villadsen. Unfortunately the session wasn’t recorded.

App Checker is built on BaseX, an XML analysis tool, and powers Socratex, which Microsoft uses to track code quality. I don’t know if Socratex will be publicly released, and I don’t remember whether this was clarified during the session.

The set of rules can be found in Application Checker’s GitHub project and it’s still WIP. I think there’s loooots of things to decide before this goes GA, and I’m a bit worried and afraid of some of the rules 😛

Rule types

There are different types of rules; some will become errors and others warnings. For example:

ExtensionsWithoutPrefix.xq: this rule will throw an error, preventing your code from compiling. It checks whether an extension class has a name ending with _Extension and an ExtensionOf attribute. If it does, the name must also include a prefix. E.g.: if we extend the class CustPostInvoice it can’t be named CustPostInvoice_Extension, it needs a prefix like CustPostInvoiceAAS_Extension.
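
A minimal sketch of an extension class that would pass the rule, using the CustPostInvoice example and the AAS prefix from above:

[ExtensionOf(classStr(CustPostInvoice))]
final class CustPostInvoiceAAS_Extension
{
}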

SelectForUpdateAbsent.xq: this rule will throw a warning. When there’s a forUpdate clause in a select statement and no doUpdate, update, delete, doDelete or write is called later, it will let us know.
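
For example, a sketch of the kind of code that would trigger the warning, assuming we only ever read the customer record:

CustTable custTable;

// forUpdate requests update rights, but no update/doUpdate/delete follows: SelectForUpdateAbsent warns here
select forUpdate custTable
    where custTable.AccountNum == '0001';

info(custTable.AccountNum);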

As of today, there are 21 rules in the GitHub project. You can contribute to the project, or you can enforce your own rules on your dev boxes without sending them to the project, just add them to the local rules folder. I’d create a rule that makes the space after an if/while/for/switch mandatory and throws an error otherwise, but that’s only a bit of my OCD when writing/reading code.

Try it on your code

We can already use Application Checker on our development environments, since PU26 I think. We just need to install JRE and BaseX on the dev box and select the check when doing a full build.

Some examples

ComplexityIndentationCombined.xq

This query checks the (wait for it…) cyclomatic complexity of methods. I’ll try to explain it… Cyclomatic complexity is a software quality metric: the number of independent paths through the code. Depending on the number of ifs, whiles, switches, etc. the code can have different outcomes through different paths, and that’s what the complexity measures.

Take this as an example, a dumb one, but ignore that and just look at the number of different paths the execution could take:

class AASAppCheckerDemo
{            
    public void complexMethod()
    {
        int a, b, c, d, e, f, g, h, i;

        if (a)
        {
            if (d)
            {				
                if (d)
                {
                }
                else if (e)
                {
                }
                else if(f)
                {
                }                
            }
            else if (e)
            {
                if (d)
                {
                }
                else if (e)
                {
                }
                else if(f)
                {
                }                
            }
            else if(f)
            {                
                if (d)
                {
                }
                else if (e)
                {
                }
                else if(f)
                {
                }
            }
        }
        else if (b)
        {
            if (d)
            {
                if (d)
                {
                }
                else if (e)
                {
                }
                else if(f)
                {
                }
            }
            else if (e)
            {
                if (d)
                {
                }
                else if (e)
                {
                }
                else if(f)
                {
                }                
            }
            else if(f)
            {               
                if (d)
                {
                }
                else if (e)
                {
                }
                else if(f)
                {
                }                
            }
        }
        else if(c)
        {
            if (d)
            {                
                if (d)
                {
                }
                else if (e)
                {
                }
                else if(f)
                {
                }                
            }
            else if (e)
            {                
                if (d)
                {
                }
                else if (e)
                {
                }
                else if(f)
                {
                }                
            }
            else if(f)
            {
                if (d)
                {
                }
                else if (e)
                {
                }
                else if(f)
                {
                }
            }
        }        
    }

}

In App Checker the error appears when the complexity is over 30. I’ve used the Lizard code complexity analyzer to calculate the complexity of the method above and I’m getting a 49.

The rule also checks the indentation depth, failing if it’s greater than 2. In the end, the purpose of both checks is to get us to break up long/large methods, which will also help enable more extension points in different places of our logic, like Microsoft did with the Data Provider classes for reports.

BalancedTtsStatement.xq

This one gives me mixed feelings. The rule checks that the ttsbegin and the ttscommit of a method are in the same scope. So the following is not possible:

public void ttsCheck()
{
    StagingTable stagingTable;

    try
    {
        while select forupdate stagingTable
            where !stagingTable.Processed
        {
            ttsbegin;                
            boolean ret = this.doThings();

            if (ret)
            {
                stagingTable.Processed = true;
                stagingTable.update();
                ttscommit;
            }
            else
            {
                ttsabort;

                ttsbegin;
                stagingTable.Processed	= false;
                stagingTable.ErrorMsg	= 'An error occurred, see log.';
                stagingTable.update();
                ttscommit;
            }
        }
    }
    catch (Exception::Error)
    {
        ttsabort;

        ttsbegin;
        stagingTable.Processed = false;
        stagingTable.update();
        ttscommit;
    }
    catch
    {
        ttsabort;

        ttsbegin;
        stagingTable.Processed = false;
        stagingTable.update();
        ttscommit;
    }
}

private boolean doThings()
{
    return true;
}

Imagine you’ve developed an integration with an external application that writes data to an intermediate table in MSDyn365FO, and you process all the pending data sequentially. You don’t want to throw an error if something goes wrong because you need the process to continue with the following record, so you ttsabort the failing line, store the error and continue. If this is not possible… how should we do this? Create a batch that creates a task for each line to process?

Plus, the standard models have plenty of ttscommit inside if statements.

RecursiveMethods.xq

This rule will block the use of recursion in static methods. I don’t get why. Application Checker should be a way to promote better coding practices, not to forbid certain patterns. If somebody gets a recursive method to prod and the exit condition isn’t met… hello testing?

Some final thoughts

Will this force developers to code better? I don’t think so, but that’s probably not Application Checker’s purpose. For centuries humans have found ways to bypass rules, laws and all kinds of restrictions, and this won’t be an exception.

Will it help? Hell yes! But the best way to ensure code quality is promoting best practices in your team, through internal training or code reviews. And even then, someone who doesn’t care about clean code will keep on writing terrible code, which might work but won’t be beautiful at all.

Finally, I’m not sure about some of the rules, like avoiding recursion in static methods or the tts thing. We’ll just have to wait and see which rules make it to the final release, and how Application Checker will finally be implemented in the MSDyn365FO application lifecycle: whether it blocks (or not) the deployment of code that doesn’t pass all the checks, or whether it’s included in the build process.

Self-service deployments: the future is here

Right now Microsoft Dynamics 365 for Finance and Operations has an old-style monolithic architecture: even though it’s now in Azure’s cloud, what we really have is a single VM (or multiple VMs for Tier 2+ environments) that runs everything: the AOS/IIS, Azure SQL Server, the Batch service, MR, etc. Exactly the same as AX 2009/2012.

This is going to change in the coming months with the self-service deployments. We’ll move from the monolithic architecture to microservices that will run all the needed components with the help of Azure’s Service Fabric. MSDyn365FO will be on a real SaaS model.

Before starting let me clarify that all these changes will only apply to Microsoft-managed Tier 2+ environments: sandbox and production environments. The build environment (until it’s made obsolete) and the cloud-hosted environments on the customer or partner subscription will still be single VMs.

What’s new?

Faster deployments

When you deploy a new environment it will start deploying without waiting for Microsoft to do it (it’s self-service!). Additionally, thanks to the new microservices architecture, it will be ready to use in under 30 minutes, compared to the 6-8 hours of regular environments. The first time feels like…

Subscription estimator

We still need to fill out the subscription estimator for licensing purposes and for MS to estimate the size of the production environment. The self-service environments can be scaled more flexibly and quickly.

No RDP access

Access to the VM desktop has been removed because… well, I guess it’s because there’s no VM anymore. All the operations that used to require RDP access can be done from LCS.

No SQL Server access

Yes, no RDP access means no RDP access to the SQL box either. We still have access to the Azure SQL DB, we just need to ask for it from LCS and it’s granted in seconds:

Additionally, you must whitelist your IP (or the one you’ll access SQL from) using the Maintain – Enable access button on LCS to be able to connect to the Azure SQL server. Access to the DB and the firewall rule will be enabled for 8 hours.

As usual, there’s no access to the production DB.

One deployable package to rule them all

If you’ve recently tried to deploy a deployable package (DP) that doesn’t contain all the packages the environment has (basically, generating the DP for a single model/package from Visual Studio), you must’ve noticed the warning about the difference between the packages in the DP and the ones in the environment.

With the self-service deployments you must include all models/packages, AND!! the ISVs’ ones, in one single deployable package.

Production updates

First, we can start the deployment to production without the 5 hours’ notice we need to give now. We can still schedule the deployment, but we can also start it instantly.

Next, the way the production environment is updated changes a bit from what we’re used to. With the new deployments we will update the sandbox environment as we do now, and once it’s done we’ll select a sandbox environment to be promoted to production. This is probably another benefit of the architecture changes.

In the future the deployment downtime will also be reduced to zero for the service updates as long as you’re on the latest update. This won’t be available for custom DPs.

How do I get this?

At the moment this is only available for some new customers. Current customers will be migrated during the coming months; MS will contact each customer to schedule a maintenance window to apply the changes.

For more information check the session Microsoft Dynamics 365 for Finance and Operations: Strategic Lifecycle Services Investments from last June’s MBAS.

Our experience with it

We got into the private deployment preview program almost a year ago with one of Axazure’s customers. The customer is now live with the self-service environments and everything has been fine so far.

But the beginning was a bit hard. Some of the functionalities were not available yet, like DB refresh or… package deployment. Yes, we needed to ask MS to deploy our DPs each time. We couldn’t even put the environments in maintenance mode! In the first months of 2019 a lot of functionality was added to LCS, and in June we finally got the production self-service update functionality. The help we’ve gotten from Microsoft’s product team has been very valuable, and they have unblocked some issues that were holding up the project.

Slow set-based operations?

In Microsoft Dynamics 365 for Finance and Operations we can execute the CRUD operations from code in two different ways: record-per-record or set-based.

Microsoft’s recommendation is to always use set-based operations, if possible, as you can check on the Implementation Best Practices for Dynamics 365: Performance best practices for a successful Dynamics 365 Finance and Operations implementation session from last June’s Business Applications Summit.

Why?

Set-based Vs. Record-per-record

When we run a query in MSDyn365FO we’re using its data access layer, which is later translated into real SQL. We can see the differences using xRecord’s getSQLStatement with generateonly on the query (and forceliterals to show the parameters’ values) to get the SQL query. For example, if we run the following code:

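A minimal sketch of that kind of query, assuming we select customer 0001 from CustTable:

CustTable custTable;

// generateonly builds the SQL statement without executing it; forceliterals embeds the literal values
select generateonly forceliterals custTable
    where custTable.AccountNum == '0001';

info(custTable.getSQLStatement());
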
We’ll get this SQL statement:

SELECT TOP 1 T1.PAYMTERMID,T1.LINEDISC,T1.TAXWITHHOLDGROUP_TH,T1.PARTYCOUNTRY,T1.ACCOUNTNUM,T1.ACCOUNTSTATEMENT,
T1.AFFILIATED_RU,T1.AGENCYLOCATIONCODE,T1.BANKACCOUNT,T1.BANKCENTRALBANKPURPOSECODE,T1.BANKCENTRALBANKPURPOSETEXT
,T1.BANKCUSTPAYMIDTABLE,T1.BIRTHCOUNTYCODE_IT,T1.BIRTHPLACE_IT,T1.BLOCKED,T1.CASHDISC,T1.CASHDISCBASEDAYS,
T1.CCMNUM_BR,T1.CLEARINGPERIOD,T1.CNAE_BR,T1.CNPJCPFNUM_BR,T1.COMMERCIALREGISTER,T1.COMMERCIALREGISTERINSETNUMBER,
T1.COMMERCIALREGISTERSECTION,T1.COMMISSIONGROUP,T1.COMPANYCHAINID,T1.COMPANYIDSIRET,T1.COMPANYNAFCODE,T1.COMPANYTYPE_MX,
T1.CONSDAY_JP,T1.CONTACTPERSONID,T1.CREDITCARDADDRESSVERIFICATION,T1.CREDITCARDADDRESSVERIFICATIONLEVEL,T1.CREDITCARDADDRESSVERIFICATIONVOID,
T1.CREDITCARDCVC,T1.CREDITMAX,T1.CREDITRATING,T1.CURP_MX,T1.CURRENCY,T1.CUSTCLASSIFICATIONID,T1.CUSTEXCLUDECOLLECTIONFEE,
T1.CUSTEXCLUDEINTERESTCHARGES,T1.CUSTFINALUSER_BR,T1.CUSTGROUP,T1.CUSTITEMGROUPID,T1.CUSTTRADINGPARTNERCODE,T1.CUSTWHTCONTRIBUTIONTYPE_BR,
T1.DEFAULTDIMENSION,T1.DEFAULTDIRECTDEBITMANDATE,T1.DEFAULTINVENTSTATUSID,T1.DESTINATIONCODEID,T1.DLVMODE,T1.DLVREASON,T1.DLVTERM,
T1.EINVOICE,T1.EINVOICEATTACHMENT,T1.EINVOICEEANNUM,T1.ENDDISC,T1.ENTRYCERTIFICATEREQUIRED_W,T1.EXPORTSALES_PL,T1.EXPRESSBILLOFLADING,
T1.FACTORINGACCOUNT,T1.FEDERALCOMMENTS,T1.FEDNONFEDINDICATOR,T1.FINECODE_BR,T1.FISCALCODE,T1.FISCALDOCTYPE_PL,T1.FORECASTDMPINCLUDE,
T1.FOREIGNRESIDENT_RU,T1.FREIGHTZONE,T1.GENERATEINCOMINGFISCALDOCUMENT_BR,T1.GIROTYPE,T1.GIROTYPEACCOUNTSTATEMENT,T1.GIROTYPECOLLECTIONLETTER,
T1.GIROTYPEFREETEXTINVOICE,T1.GIROTYPEINTERESTNOTE,T1.GIROTYPEPROJINVOICE,T1.ICMSCONTRIBUTOR_BR,T1.IDENTIFICATIONNUMBER,T1.IENUM_BR,
T1.INCLTAX,T1.INSSCEI_BR,T1.INTBANK_LV,T1.INTERCOMPANYALLOWINDIRECTCREATION,T1.INTERCOMPANYAUTOCREATEORDERS,T1.INTERCOMPANYDIRECTDELIVERY,
T1.INTERESTCODE_BR,T1.INVENTLOCATION,T1.INVENTPROFILEID_RU,T1.INVENTPROFILETYPE_RU,T1.INVENTSITEID,T1.INVOICEACCOUNT,T1.INVOICEADDRESS,
T1.INVOICEPOSTINGTYPE_RU,T1.IRS1099CINDICATOR,T1.ISRESIDENT_LV,T1.ISSUEOWNENTRYCERTIFICATE_W,T1.ISSUERCOUNTRY_HU,T1.LINEOFBUSINESSID,
T1.LVPAYMTRANSCODES,T1.MAINCONTACTWORKER,T1.MANDATORYCREDITLIMIT,T1.MANDATORYVATDATE_PL,T1.MARKUPGROUP,T1.MCRMERGEDPARENT,
T1.MCRMERGEDROOT,T1.MULTILINEDISC,T1.NIT_BR,T1.NUMBERSEQUENCEGROUP,T1.ONETIMECUSTOMER,T1.ORDERENTRYDEADLINEGROUPID,
T1.ORGID,T1.OURACCOUNTNUM,T1.PACKAGEDEPOSITEXCEMPT_PL,T1.PACKMATERIALFEELICENSENUM,T1.PARTY,T1.PARTYSTATE,T1.PASSPORTNO_HU,T1.PAYMDAYID,
T1.PAYMENTREFERENCE_EE,T1.PAYMIDTYPE,T1.PAYMMODE,T1.PAYMSCHED,T1.PAYMSPEC,T1.PDSCUSTREBATEGROUPID,T1.PDSFREIGHTACCRUED,
T1.PDSREBATETMAGROUP,T1.PRICEGROUP,T1.RESIDENCEFOREIGNCOUNTRYREGIONID_IT,T1.RFC_MX,T1.SALESCALENDARID,T1.SALESDISTRICTID,
T1.SALESGROUP,T1.SALESPOOLID,T1.SEGMENTID,T1.SERVICECODEONDLVADDRESS_BR,T1.STATEINSCRIPTION_MX,T1.STATISTICSGROUP,T1.SUBSEGMENTID,
T1.SUFRAMA_BR,T1.SUFRAMANUMBER_BR,T1.SUFRAMAPISCOFINS_BR,T1.SUPPITEMGROUPID,T1.TAXGROUP,T1.TAXLICENSENUM,T1.TAXPERIODPAYMENTCODE_PL,
T1.TAXWITHHOLDCALCULATE_IN,T1.TAXWITHHOLDCALCULATE_TH,T1.UNITEDVATINVOICE_LT,T1.USECASHDISC,T1.USEPURCHREQUEST,T1.VATNUM,
T1.VENDACCOUNT,T1.WEBSALESORDERDISPLAY,T1.AUTHORITYOFFICE_IT,T1.EINVOICEREGISTER_IT,T1.FOREIGNERID_BR,T1.PRESENCETYPE_BR,
T1.TAXGSTRELIEFGROUPHEADING_MY,T1.FOREIGNTAXREGISTRATION_MX,T1.CUSTWRITEOFFREFRECID,T1.ISEXTERNALLYMAINTAINED,T1.SATPAYMMETHOD_MX,
T1.SATPURPOSE_MX,T1.CFDIENABLED_MX,T1.FOREIGNTRADE_MX,T1.WORKFLOWSTATE,T1.USEORIGINALDOCUMENTASFACTURE_RU,T1.COLLECTIONLETTERCODE,
T1.BLOCKFLOORLIMITUSEINCHANNEL,T1.AXZMODEL182LEGALNATURE,T1.AXZCRMGUID,T1.MODIFIEDDATETIME,T1.MODIFIEDBY,T1.CREATEDDATETIME,T1.RECVERSION,
T1.PARTITION,T1.RECID,T1.MEMO 

FROM CUSTTABLE T1 

WHERE (((PARTITION=5637144576) AND (DATAAREAID=N'usmf')) AND (ACCOUNTNUM=N'0001'))

 

We can see all the fields are being selected, and the where clause contains the account number we selected (plus DataAreaId and Partition).

When a while select is run in MSDyn365FO, a SQL select statement is executed on SQL Server for each loop iteration. The same happens if an update or delete is executed inside the loop. This is known as a record-per-record operation.

Imagine you need to update the note of all the customers with customer group 10. We could do this with a while select, like this:

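Something like this sketch (the Memo field and the 'Special customer' text are taken from the resulting SQL shown further below):

CustTable custTable;

ttsbegin;
// One UPDATE is sent to SQL Server for every customer in group 10
while select forUpdate custTable
    where custTable.CustGroup == '10'
{
    custTable.Memo = 'Special customer';
    custTable.update();
}
ttscommit;
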
This would make as many calls to SQL Server as there are customers in group 10, one for each loop iteration. Or we could use set-based operations:

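A sketch of the set-based version with update_recordset:

CustTable custTable;

ttsbegin;
// A single UPDATE handles the whole customer group
update_recordset custTable
    setting Memo = 'Special customer'
    where custTable.CustGroup == '10';
ttscommit;
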
This will execute a single SQL statement on SQL Server that updates all the customers with customer group 10, instead of one query per customer:

UPDATE CUSTTABLE 
SET MEMO = 'Special customer' 
WHERE (((PARTITION=5637144576) 
AND (DATAAREAID=N'usmf')) 
AND (CUSTGROUP=N'10'))

There are three set-based operations in MSDyn365FO: update_recordset to update records, insert_recordset to create records, and delete_from to delete records. Plus we can make massive inserts using RecordSortedList and RecordInsertList.

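As an illustration, a minimal sketch of a massive insert with RecordInsertList (the account numbers are made up and other mandatory fields are omitted):

CustTable custTable;
RecordInsertList insertList = new RecordInsertList(tableNum(CustTable));

// Records are buffered and written to the database in a reduced number of round trips
custTable.clear();
custTable.AccountNum = 'DEMO-0001';
custTable.CustGroup  = '10';
insertList.add(custTable);

custTable.clear();
custTable.AccountNum = 'DEMO-0002';
custTable.CustGroup  = '10';
insertList.add(custTable);

insertList.insertDatabase();
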
Running these operations instead of while selects should obviously be faster as only a single SQL statement is executed. But…

Why could my set-based operations be running slow?

There are some well-documented scenarios in which set-based operations fall back to record-per-record operations, as we can see in the following table:

Scenario | DELETE_FROM | UPDATE_RECORDSET | INSERT_RECORDSET | ARRAY_INSERT | Use … to override
Non-SQL tables | Yes | Yes | Yes | Yes | Not applicable
Delete actions | Yes | No | No | No | skipDeleteActions
Database log enabled | Yes | Yes | Yes | No | skipDatabaseLog
Overridden method | Yes | Yes | Yes | Yes | skipDataMethods
Alerts set up for table | Yes | Yes | Yes | No | skipEvents
ValidTimeStateFieldType property not equal to None on a table | Yes | Yes | Yes | Yes | Not applicable

In the example, if the update method of CustTable is overridden (which it is), the update_recordset operation will be run like a while select that updates each record one by one.

In the case of update_recordset this can be solved by calling the skipDataMethods method before running the update:

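A minimal sketch, reusing the update_recordset example from above:

CustTable custTable;

// Skip the overridden update method so the operation stays set-based
custTable.skipDataMethods(true);

ttsbegin;
update_recordset custTable
    setting Memo = 'Special customer'
    where custTable.CustGroup == '10';
ttscommit;
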
This will avoid calling the update method (or insert in the case of insert_recordset), more or less like calling doUpdate in a loop. The other scenarios can be overridden with the corresponding method in the last column of the table.

So, for bulk updates I’d always use set-based operations, and enable them on data entities too with the EnableSetBasedSqlOperations property.

And now another but is coming.

Should I always use set-based operations when updating large sets of data?

Well, it depends on the data you’re working with. There’s a wonderful blog post from Denis Trunin called “Blocking in D365FO (and why you shouldn’t always follow MS recommendations)” that shows a perfect example where set-based operations would be counterproductive.

As always, developing an ERP is quite sensitive, and similar scenarios can have different solutions. Analyze the requirements and decide which one to use.

Update to Visual Studio 2019 for #MSDyn365FO

Tired of developing in Visual Studio 2015? Do you feel you’ve been left behind, forgotten in the past? Worry no more, you can use Visual Studio 2017/2019 to develop for Microsoft Dynamics 365 for Finance & Operations!

What are the advantages?

Absolutely none at all! Visual Studio will still go unresponsive whatever the version is, because it’s the dev tools extension that’s causing the issues.

Of course we get the option to use Live Share, and for screen sharing sessions that’s way better than Teams. Hey, and we’ll be using the latest VS version!

Is it difficult?

No, zero mysteries. The first thing we need to do is download Visual Studio 2019 Professional (or Enterprise, but it won’t make much of a difference for D365 development) and install it:

Select the .NET desktop development option and press Install. When the installation is finished we log in with our account.

The next step is installing the Dynamics dev tools extension for VS. Go to drive K, and in the DeployablePackages folder you’ll find some ZIP files that contain the extension in their DevToolsService/Scripts folder:

An alternative is, for example, downloading a Platform Update package, which also contains the dev tools extension, maybe with some updates to it.

Install the extension and the VS2019 option is already there:

Once installed open VS as the admin and…

And also…

Don’t panic! The extension was made for VS2015 and using it in a newer version can cause some warnings, but it’s just that, the tools are installed and ready to use:

As I said in the beginning, the dev tools extension is the one causing the unresponsiveness or blocks in VS, and Visual Studio 2019 is letting us know:

But regardless of the warnings working with Visual Studio 2019 is possible. I’ve been doing so for a week and I still haven’t found a blocking issue that makes me go back to VS2015.

Dev tools preview

In October 2019 the preview version of the dev tools will be published, as we saw at the MBAS in Atlanta. Let’s see which new features it brings, both regarding a possible VS version upgrade and performance.

Consume a SOAP web service in Dynamics 365 for Finance and Operations using ChannelFactory

If you ever need to consume a SOAP web service from Dynamics 365 for Finance and Operations, the first step you should take is asking the people responsible for that web service to create a REST version. If that’s not possible this post is for you.

I’ll use this SOAP web service I found online at http://www.dneonline.com/calculator.asmx for the example. It’s a simple calculator service with four methods to add, subtract, multiply or divide two integers.

Consuming a SOAP service in .NET

Let’s start with the basics. How do we consume a SOAP web service in Visual Studio? Easy peasy. Just add a service reference to your project:

And point it to the web service of your choice:

This will add the reference to the project:

With that done we can create an instance of the web service’s client and call one of its methods:

3 + 6 = 9, it looks like it’s working.

Consuming a SOAP service in Dynamics 365 for Finance and Operations

To consume the web service in FnO, create a new project in Visual Studio, right-click on References and add the service reference:

Hmmm… nope, it can’t be done, no service reference option.

Consuming a SOAP service in Dynamics 365 for Finance and Operations (I hope…)

The problem is that we cannot add a service reference in Visual Studio on 365 dev boxes.

What do the docs say about this? Well, like in AX2012 we need to create a .NET class library that will consume that web service, then add the reference to our DLL on 365 and call the service methods from a client object. All right!

There it is. A reference to our class library and a runnable class that will do the job:

Let’s run it!

What?

An exception of type ‘System.InvalidOperationException’ occurred in System.ServiceModel.dll but was not handled in user code

Additional information: Could not find default endpoint element that references contract ‘AASSOAPCalculatorService.CalculatorSoap’ in the ServiceModel client configuration section. This might be because no configuration file was found for your application, or because no endpoint element matching this contract could be found in the client element.

Contract? What contract? I know nothing about a contract. Nobody told me about any contract! What does Wikipedia say about SOAP?

Soap is the term for a salt of a fatty acid or for a variety of cleansing and lubricating products produced from such a substance.

Oops wrong soap…

SOAP provides the Messaging Protocol layer of a web services protocol stack for web services. It is an XML-based protocol consisting of three parts:

  • an envelope, which defines the message structure and how to process it

  • a set of encoding rules for expressing instances of application-defined datatypes

  • a convention for representing procedure calls and responses

The envelope is the contract. A data contract is an agreement between a service and a client that abstractly describes the data to be exchanged. That contract.

Consuming a SOAP service in Dynamics 365 for Finance and Operations (I promise this is the good one)

If we check the class library there’s a file called app.config:

In this file we can see the endpoint the DLL is using. This is fixed (hardcoded), and in case there’s a test endpoint and a production endpoint we would have to change the address accordingly and have two different DLLs, one for each endpoint. We can also see the data contract being used by the service, the one called AASSOAPCalculatorService.CalculatorSoap. Because #MSDyn365FO is a web-based ERP, we could solve this by adding the system.serviceModel node to the server’s web.config file, right? (app.config for desktop apps, web.config for web apps). Yes, but this would be useless as we have no access to the production environment to do this, and it will be impossible on the sandbox Tier 2+ environments once the self-service environments start to roll out.

So, what do we do? Easy, ChannelFactory<T> to the rescue! The ChannelFactory<T> allows us to create an instance of the factory for our service contract and then creates a channel between the client and the service. The client being our class in D365 and the service the endpoint (obviously).

Then we do the following:

The BasicHttpBinding object can be a BasicHttpsBinding if the web service is running on HTTPS. The endpoint is the URL of the web service. Then we instantiate a service contract from our class with the binding and endpoint and create the channel. Now we can call the web service methods and…

It’s working! And it’s even better: if there are different endpoints for a test and a prod web service, we just have to parametrize them!

But really, don’t use SOAP services, go with the REST.

Do you want to become a better X++ developer?

I’ve been an X++ developer for almost 10 years, that’s 100% of my professional career, excluding internships. During these 10 years I’ve seen the product evolve and, in my opinion, the last three years with #MSDyn365FO have been the most exciting by far, as I’ve said several times.

The move from the notepad-like MorphX to Visual Studio, Azure DevOps, and the asset upload and release tasks makes me feel like a real software developer. And this has been only the beginning of the journey: we’re now starting with test automation with RSAT and the ATL, we’ll (hopefully…) finally do testing!

And how can we be better X++ developers?

It takes time

Like learning any other thing. You know nothing on day one, you learn things mainly by doing them, and with time you realize that the ERP is huge and you just know a small portion of it. Keep learning. Time will pass and you’ll realize that you still know only a small portion of Operations.

Love your job

This one might be hard sometimes… be passionate about what you do. Find a company that helps you grow, try having fun at work, it will be difficult, like during go-lives, but even in those moments there’s time for laughs. With passion the rest is easier.

Functional knowledge

Obviously developers need to know how the processes work from a functional point of view. In case of doubt ask your functional colleagues, don’t waste time digging through the code trying to understand the functionality. After the functional explanation you’ll see the code more clearly.

I always say that programming in X++ is easy, the difficult part is knowing the business processes.

Learn other languages

Get outside X++. Working (or playing) with a different language can help you lose or soften the bad habits you may have picked up with AX.

Developers usually know more than one language, from previous jobs or from pet projects. C# is obviously a good choice, because we can use .NET libraries in X++ code or we can create our own. Learn the syntax (easy), try the foreach (I’d love having this in X++), LINQ, etc.

I also used to think that, at some point in the future, X++ would be completely replaced by .NET/C#, so learning .NET was a good idea. But seeing the latest investments in X++, like SysDa or the ATL, I have some doubts in the mid term. Plus, X++’s data access layer is wonderful.

Explore Azure

Including DevOps. Luckily there’s no option not to use DevOps. But don’t use it just as a source control tool. It’s waaaay more than that.

Explore Azure, it’s huge and the solution to a problem might be there. Azure Functions, Logic Apps, Azure SQL, Service Bus (combined with Business Events, for example). It’s not AX by itself anymore, 365 comes with friends in the cloud.

Power Platform

After the last MBAS it’s crystal clear that Microsoft is investing a lot in the Power Platform. Flow, PowerApps, AI Builder… All these products can be integrated with MSDyn365FO.

A PowerApp can be used instead of a mobile workspace, and Flow can send emails when triggered by a Business Event or a CRUD operation.

Learn something about CRM and CDS, you’ll have to integrate them with FnO at some point, for sure.

Share and teach

For me teaching is reaaaaaaaaaaaaaaally difficult, I’m a terrible teacher: the things in my head are clear, but the link between my head and my mouth is broken. I find it very hard to turn my thoughts into words. Writing things down helps me put them in order, because I can write and delete, and write again, and again 🙂

Share your knowledge, do internal training with your colleagues, be a speaker. I never thought about that until I started at Axazure, and when I was offered the chance to speak at Dynamics 365 Saturday my first thought was “Me? What could I tell that would be interesting to people?”. In the end you just need to pick a topic you know a bit (or nothing at all) about and expand your knowledge, or have stupid ideas and bring them to life!

These are just some ideas, there are lots of things that can be done to improve, but the most important one is patience. Time and patience.

Microsoft Business Applications Summit 2019

During the past week I attended the Microsoft Business Applications Summit (MBAS) with some of my colleagues from Axazure. It was my first time at an event like this and the experience was wonderful and intense (and tiring). The event took place from Sunday to Tuesday, and traveling from Madrid to Atlanta and back, including a 5-hour delay in Toronto and a 6-hour layover in Newark, in just 5 days was exhausting.

The event’s organization was really, really good, with lots of sessions from the product teams, who would answer all the attendees’ questions. The only bad thing was the overlap of sessions and not being able to see all of them in person. Luckily, all the sessions were recorded and are already available online.

Another good thing about the MBAS is that there’s more than #MSDyn365FO: all of Microsoft’s business apps are in the same place and you can learn some things that are out of your area. In my case it was a workshop about the new AI Builder for PowerApps. Microsoft is really putting a lot of effort into PowerApps, Flow and CDS, and I think it’s a good idea that we, FnO folks, start using all of them in our projects because that’s where the product is heading. If I’m lucky enough to attend the MBAS next year in Anaheim, I think I’ll try to go to as many workshops as possible and see if some ideas for integrating new things into MSDyn365FO come to me.

Some interesting sessions for technical people were:

There’s a lot more stuff about CDS, PSA, RSAT and other things and products. Just go and check the sessions.

You can read Juanan’s post here, Axazure’s blog post, and Demian’s here.