This post extends my previous posts with one more message format. This time we use XML: we import files in X++ from an Azure File Share and create purchase orders.
I use a very simplified example of integration with the INFOR system. This system has an integration component that can export data in XML format, and we will process (import) these files in D365FO.
The good thing is that a file generated from INFOR is probably the most complex format you may see in XML integrations. It has a custom namespace, data in tags, data in attributes, and data in repeating elements. Here is a sample file content:
<?xml version="1.0" encoding="UTF-8"?>
<SyncPurchaseOrder xmlns="http://schema.infor.com/InforOAGIS/2" languageCode="en-US">
  <DataArea>
    <PurchaseOrder>
      <PurchaseOrderHeader>
        <DocumentID>PO00001</DocumentID>
        <DocumentDateTime>2022-04-16</DocumentDateTime>
        <SupplierParty>
          <PartyIDs>
            <ID accountingEntity="USMF">1001</ID>
          </PartyIDs>
        </SupplierParty>
        <ShipToParty>
          <Location type="Warehouse">
            <ID accountingEntity="USMF">11</ID>
          </Location>
        </ShipToParty>
        <UserArea>
          <Property>
            <NameValue name="eam.UDFCHAR01" type="StringType">Requestor Name</NameValue>
          </Property>
        </UserArea>
      </PurchaseOrderHeader>
      <PurchaseOrderLine>
        <LineNumber>10</LineNumber>
        <Quantity unitCode="EA">15</Quantity>
        <UnitPrice>
          <Amount currencyID="AUD">10</Amount>
        </UnitPrice>
        <UserArea>
          <Property>
            <NameValue name="eam.UDFCHAR01" type="StringType">M0001</NameValue>
          </Property>
        </UserArea>
      </PurchaseOrderLine>
      <PurchaseOrderLine>
        <LineNumber>20</LineNumber>
        <Quantity unitCode="EA">2</Quantity>
        <UnitPrice>
          <Amount currencyID="AUD">10</Amount>
        </UnitPrice>
        <UserArea>
          <Property>
            <NameValue name="eam.UDFCHAR01" type="StringType">M0004</NameValue>
          </Property>
        </UserArea>
      </PurchaseOrderLine>
    </PurchaseOrder>
  </DataArea>
</SyncPurchaseOrder>
We will read this file into two staging tables (header and lines) and then create a purchase order based on these tables.
To deal with such a complex format, I created a helper class, DEVIntegXMLReadHelper. It uses the standard System.Xml classes and provides several helper methods that simplify the code structure:
// Set up the namespace alias (used in XPath searches)
readHelper.initNamespace(xmlDoc, 'infor', @'http://schema.infor.com/InforOAGIS/2');

// Read typed values from the specified tags
header.PurchId = readHelper.xmlGetValueNodeStr(nodePOHeader, 'DocumentID');
line.PurchPrice = readHelper.xmlGetValueNodeReal(nodePOLine, 'UnitPrice/Amount');

// Search for a node inside the specified node by path
node = readHelper.xmlGetNodeByPath(nodePOHeader, 'SupplierParty/PartyIDs/ID');
header.OrderAccount = node.get_InnerText();

// Search for a node inside the specified node by a search string
node = readHelper.xmlSelectSingleNode(nodePOHeader, 'infor:UserArea//infor:Property//infor:NameValue[@name=\'eam.UDFCHAR01\']');
header.RequestorName = node.get_InnerText();

// Get the data from an attribute
xmlAttributeCollection = node.get_Attributes();
xmlAttribute = xmlAttributeCollection.get_ItemOf("accountingEntity");
header.CompanyId = xmlAttribute.get_InnerText();
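Putting these pieces together, the read step may look like the following sketch (a simplified fragment: messageText and the XPath strings are illustrative, and the loop over the repeating lines uses the standard System.Xml API directly):

System.Xml.XmlDocument xmlDoc = new System.Xml.XmlDocument();
xmlDoc.LoadXml(messageText);

DEVIntegXMLReadHelper readHelper = new DEVIntegXMLReadHelper();
readHelper.initNamespace(xmlDoc, 'infor', @'http://schema.infor.com/InforOAGIS/2');

// The header element is unique, the line elements repeat
System.Xml.XmlNode nodePOHeader = readHelper.xmlSelectSingleNode(xmlDoc, '//infor:PurchaseOrderHeader');

// For the repeating lines, use SelectNodes with a namespace manager
System.Xml.XmlNamespaceManager xmlNsMgr = new System.Xml.XmlNamespaceManager(xmlDoc.get_NameTable());
xmlNsMgr.AddNamespace('infor', 'http://schema.infor.com/InforOAGIS/2');
System.Xml.XmlNodeList nodeListLines = xmlDoc.SelectNodes('//infor:PurchaseOrderLine', xmlNsMgr);

int i;
for (i = 0; i < nodeListLines.get_Count(); i++)
{
    System.Xml.XmlNode nodePOLine = nodeListLines.Item(i);
    // Read the line-level values as shown above and insert a staging line
}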
The full class is available here: DEVIntegTutorialProcessPurchConfirmXML.
In this blog post, I describe a solution that implements the following scenario: our integration reads messages from the Azure file share and creates purchase orders in D365FO.
The diagram that describes the process is below:
In the following sections, I provide a solution that can be used as a starting point to import and process messages from an Azure file share.
Initially, our XML files will be loaded to Azure storage and then processed by D365FO.
The Connection types form allows specifying the connection type resource (currently, Azure file share, Azure Service Bus and SFTP are supported) and connection details for the selected resource.
For storing a connection string, there are three options:
The next form used to describe our integration is the Inbound message types form:
This form contains three main sections:
1 - Details tab
Defines the Azure file share connection and the path to the folders used to import messages.
It also contains a link to the custom class that will process messages from the queue. The class should extend the base class DEVIntegProcessMessageBase and implement the following method:
abstract void processMessage(DEVIntegMessageTable _messageTable, DEVIntegMessageProcessResult _messageProcessResult)
{
...
}
The integration engine calls this method outside of a transaction, so all transaction control can be implemented per message type. There are different options here: one transaction per message, multiple transactions (for example, if a file contains several independent journals), or a separate transaction per line. The result of processing and the current stage should be written to the _messageProcessResult variable so that, in case of an unhandled exception, this information can be saved for review. Also, this class is created once per import session, so it can implement different caching options.
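For example, a skeleton of such a processing class might look like the sketch below (the class name, the two helper methods and the setStage call are hypothetical names used for illustration, not the framework's exact API):

class DEVIntegTutorialProcessPurchOrder extends DEVIntegProcessMessageBase
{
    public void processMessage(DEVIntegMessageTable _messageTable, DEVIntegMessageProcessResult _messageProcessResult)
    {
        // Step 1: parse the XML and fill the staging tables in its own transaction
        ttsbegin;
        this.readMessageToStaging(_messageTable);
        ttscommit;
        _messageProcessResult.setStage('Staging');

        // Step 2: create the purchase order from the staging data in a separate transaction
        ttsbegin;
        this.createPurchaseOrder(_messageTable);
        ttscommit;
    }
}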
2 - Operation parameters tab
Contains parameters that are specific to the current operation. In our case, we don't have any parameters.
3 - Advanced settings tab
Contains some common parameters:
The number of parallel threads used to process incoming messages.
Reading limits from the queue (e.g. read only the first three messages; this is for testing purposes).
Also, this form contains service operations:
The Incoming messages table stores the details of each incoming message.
Every message has a status field that can contain the following values:
In this form, it is also possible to do the following operations:
The import is done by a periodic batch job that we can run for one or multiple message types.
It connects to the Azure file storage, reads a file, creates a record in the Incoming messages table with the Ready status, and attaches the message content to this record. After this, the file is moved to the Archive folder. If Run processing is selected, the system will also execute the processing of the loaded messages.
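Internally, the reading step may be implemented with the Azure.Storage.Files.Shares .NET library, roughly like the sketch below (the share and folder names are illustrative, and the record creation is only outlined in comments):

using Azure.Storage.Files.Shares; // declared at the top of the class

// The connection string comes from the Connection types setup (hypothetical accessor)
str connectionString = this.connectionString();

ShareClient share = new ShareClient(connectionString, 'd365-import');
ShareDirectoryClient inboundDir = share.GetDirectoryClient('Inbound');

var enumerator = inboundDir.GetFilesAndDirectories().GetEnumerator();
while (enumerator.MoveNext())
{
    Azure.Storage.Files.Shares.Models.ShareFileItem fileItem = enumerator.get_Current();
    if (!fileItem.get_IsDirectory())
    {
        ShareFileClient fileClient = inboundDir.GetFileClient(fileItem.get_Name());
        System.IO.Stream content = fileClient.Download().get_Value().get_Content();
        // Create a DEVIntegMessageTable record with the Ready status,
        // attach the stream content to it and move the file to the Archive folder
    }
}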
Message processing may also be executed as a separate operation - Process incoming messages - that selects all unprocessed messages and calls the processing class for them.
The logic of how to process a file differs per message type/class. For a simple scenario, the class can just read the message content and create data in one transaction. For this blog post, I implemented two-step processing. Also, for the Process operation, you can specify the number of processing attempts per file. The system tries to process a message and, in case of an error, increases the message's Processing attempts counter. This allows retrying the processing several times without user involvement.
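The retry handling can be illustrated with the following sketch (the status values and the static helper methods are hypothetical names, not the framework's exact API):

DEVIntegMessageTable messageTable;

while select messageTable
    where  messageTable.MessageStatus == DEVIntegMessageStatus::Ready
       || (messageTable.MessageStatus == DEVIntegMessageStatus::Error
        && messageTable.ProcessingAttempts < maxProcessingAttempts)
{
    try
    {
        // processingClass is created once per message type from the setup
        processingClass.processMessage(messageTable, messageProcessResult);
        DEVIntegMessageTable::updateStatus(messageTable.RecId, DEVIntegMessageStatus::Processed);
    }
    catch
    {
        // Increase the attempts counter and keep the message in the Error status
        DEVIntegMessageTable::incrementAttempts(messageTable.RecId);
    }
}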
See a sample diagram below:
During the first step, the class reads the message data and writes it into a staging table. Then, based on the staging data values, a new purchase order is created. This can be done using data entities or by writing directly to the tables, as in the sketch below.
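A simplified sketch of the second step, writing directly to the tables (the staging table and field names are hypothetical; the full class uses its own names):

// DEVIntegPurchHeaderStaging and DEVIntegPurchLineStaging are hypothetical
// staging table names; headerStaging is assumed to be selected for the current message
DEVIntegPurchHeaderStaging headerStaging;
DEVIntegPurchLineStaging   lineStaging;
PurchTable                 purchTable;
PurchLine                  purchLine;

ttsbegin;

// A changecompany(headerStaging.CompanyId) scope may be needed; omitted for brevity
purchTable.initValue();
purchTable.PurchId      = headerStaging.PurchId;
purchTable.OrderAccount = headerStaging.OrderAccount;
purchTable.initFromVendTable(VendTable::find(headerStaging.OrderAccount));
purchTable.insert();

while select lineStaging
    where lineStaging.PurchId == headerStaging.PurchId
{
    purchLine.clear();
    purchLine.PurchId    = purchTable.PurchId;
    purchLine.ItemId     = lineStaging.ItemId;
    purchLine.PurchQty   = lineStaging.PurchQty;
    purchLine.PurchPrice = lineStaging.PurchPrice;
    // The last parameter (searchPrice) is false because the price comes from the message
    purchLine.createLine(true, true, true, true, true, false);
}

ttscommit;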
Creating a purchase order based on some data is not a big task. The complexity of an integration is often related to exception processing and error monitoring. Let's discuss typical errors and how users can deal with them.
A batch job exception will be generated if our batch job can't connect to the Azure file share or read files. It is a configuration error requiring a system administrator's attention. Notification may be done using the batch job status. After troubleshooting, the system administrator can use the "Test connection" button to validate that the system can now read files from the Azure file share.
The next error type is a wrong file format, where we can't even read the data from a message.
To test this case, I renamed one XML element. After the import, users will see this message with the Error status. Notification may be done using standard filtering by the Status column.
Users can view the error log, then download the message and check the reason for the error. For example, there may be the following reasons:
The message has a correct structure but contains incorrect data (e.g. values that don't exist). In this case, the status of our message will be Error, and an error log will be generated.
Users can view this error, display the staging data to check the values from the message, and take some actions (e.g. create the missing values in the related tables if they are valid). After that, they can process this message again.
In some implementations (e.g. EDI), we can even allow editing of the staging data. A similar type of error is a posting error.
This is probably the worst scenario: the message was processed successfully, but the resulting document contains some incorrect data.
To analyse the result, users can view the staging data and check that the values are correct. Also, the message content can be sent to a developer, who may trace the document processing using the manual Import message button.
In this post, I provided a sample implementation of an Azure File Share integration for D365FO. Its main concept is to create a basic framework that simplifies troubleshooting (most typical errors and all related data can be viewed in one form - Incoming messages) and provides some additional logging.
In this series of posts, I provided samples for different file formats used for import:
Different complex D365FO documents:
And different exchange media:
This may or may not be appropriate in your case (there are different options for how to implement this). Anyway, I recommend using the following checklist while designing an integration: Integration solution specification.
The files used for this post can be found in the following folder.
I hope you find this information useful. As always, if you have any improvements, suggestions or questions about this work, don't hesitate to contact me.
Similar posts: