
Tired of Azure Functions? Switch to Azure Data Mapper for Logic Apps transforms

Subhad365

Azure Data Mapper is a very convincing way of transforming your data schemas, with little or no coding required, and it is easy to use, offering:
a. Drag-and-drop mapping
b. A rich built-in library containing a good number of assisting functions
c. The ability to manually test the mapping and preview the data
d. The ability to debug
e. The ability to use the maps very easily in API Management services and Logic Apps
f. An overview pane to review whether your mappings are correct.

A lot of emphasis has been laid on transforming a variety of file types:
a. XML to XML
b. JSON to JSON
c. XML to JSON and vice versa

However, yes, there are limitations too. As of now:
  • Data Mapper currently works only in Visual Studio Code running on Windows operating systems.
  • Data Mapper is currently available only in Visual Studio Code, not the Azure portal, and only from within Standard logic app projects, not Consumption logic app projects.
  • Data Mapper currently doesn't support comma-separated values (.csv) files.
  • The Data Mapper's Code view pane is currently read only.
  • The map layout and item position are currently automatic and read only.
  • To call maps created with the Data Mapper tool, you can only use the Data Mapper Operations action named Transform using Data Mapper XSLT. For maps created by any other tool, use the XML Operations action named Transform XML.
  • To use the maps that you create with the Data Mapper tool but in the Azure portal, you must add them directly to your Standard logic app resource.
Nevertheless, if you want to avoid a lot of coding and maintenance (and, of course, cost), you can consider using Data Mapper.
The following example gives a step-by-step walkthrough of how to achieve this.
Prerequisites:
You must have VS Code installed on your computer. Download the following extensions:
  1. Azure
  2. Azure Logic Apps (Standard)
  3. Azurite -- check that it is running in VS Code: go to VS Code >> View >> Command Palette >> Azurite: Start

Azurite is a local storage emulator that lets you run your workflows locally before actually deploying them to the Azure portal.
OK, all set! Let us create the data map now.
Uh, no -- before that, let me give you the scenario. We have a situation where the client has a sales order XML coming in with various details:
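As a minimal sketch, assuming element names such as SalesOrder, SalesId, Customer, Lines, Qty and LineAmount (illustrative only; your actual schema will differ), the incoming payload might look like this:

<SalesOrder>
  <SalesId>SO-000123</SalesId>
  <Customer>
    <Title>Mr.</Title>
    <FirstName>John</FirstName>
    <LastName>Doe</LastName>
  </Customer>
  <Lines>
    <Line>
      <ItemId>A001</ItemId>
      <Qty>2</Qty>
      <LineAmount>150.00</LineAmount>
    </Line>
    <Line>
      <ItemId>B002</ItemId>
      <Qty>3</Qty>
      <LineAmount>75.50</LineAmount>
    </Line>
  </Lines>
</SalesOrder>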
 
This evidently implies that the sales order has various items on its lines. It also carries the customer details along with it.
We have to create an output that will
a. join the Title, first name and last name into one field,
b. total up the quantities associated with the order, and
c. sum up the line amounts:
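Using the same assumed element names as the input sketch above, the desired summary output would look roughly like this (CustName, TotalQty and TotalLineAmount are again illustrative names, not prescribed ones):

<SalesOrderSummary>
  <SalesId>SO-000123</SalesId>
  <CustName>Mr. John Doe</CustName>
  <TotalQty>5</TotalQty>
  <TotalLineAmount>225.50</TotalLineAmount>
</SalesOrderSummary>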
  
Make sure that you have the XSD files for both the source and target schemas ready.
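If you are only experimenting, a minimal source XSD matching the assumed payload above could look like the sketch below (the target XSD for the summary document follows the same pattern); treat it purely as an illustration, since your real schemas will be richer:

<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="SalesOrder">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="SalesId" type="xs:string"/>
        <xs:element name="Customer">
          <xs:complexType>
            <xs:sequence>
              <xs:element name="Title" type="xs:string"/>
              <xs:element name="FirstName" type="xs:string"/>
              <xs:element name="LastName" type="xs:string"/>
            </xs:sequence>
          </xs:complexType>
        </xs:element>
        <xs:element name="Lines">
          <xs:complexType>
            <xs:sequence>
              <xs:element name="Line" maxOccurs="unbounded">
                <xs:complexType>
                  <xs:sequence>
                    <xs:element name="ItemId" type="xs:string"/>
                    <xs:element name="Qty" type="xs:decimal"/>
                    <xs:element name="LineAmount" type="xs:decimal"/>
                  </xs:sequence>
                </xs:complexType>
              </xs:element>
            </xs:sequence>
          </xs:complexType>
        </xs:element>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>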
Our data mapper will do this mapping in the subsequent steps given below: 
Step -1: Click on Azure extension >> Create new data map:

Choose 'Create a new project' from the prompt and create a new folder for your data mapper. Give it the name 'DataMapperForSalesOrderSummary'.
Choose 'Stateful workflow' from the template prompt:

Here we will use a stateful workflow (Logic Apps) for our demo. Give it a suitable name: SalesOrderSummaryWorkflow.
Give it some time to load.
Step -2: Give a suitable name to your data map: SalesOrderSummarymap. This will cause the following screen to pop up.

Go ahead and upload your XSDs for the source and target sections:

Choose your XSD document by browsing and adding it. Your resulting screen will now look like this, with both the input and output schemas:

At this point you can only see the output schema fields. The input schema fields can be seen if you click the 'Show source schema' button, as shown below:

Step -3: Once you have clicked on 'Show source schema', you can see the schema fields.


Now you need to select, from the input schema, all the fields you need for the mapping. This is the most important step: it's like selecting the necessary fields in a query statement.

Step -4: Now start mapping from input to output, as shown below for the following fields (just connect the right end of the source box to the left end of the target box).

For now, let's not touch the other fields.
Click the highlighted function icon and add the 'Concatenate' function:

The following Concat function will be shown:

This is where we will concatenate the title, first name and last name from the source. Click on it and apply the following settings:

Choose the necessary fields from the drop-down. I have added a space between Title and FirstName and between FirstName and LastName, so the resultant field should look as shown in the highlighted section.
Now the Concat function looks like this:

Click the connector at the top right of the function and stretch it to connect to 'CustName' in the target schema.
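Under the hood, the map is compiled into XSLT. Purely as an illustration of what this concatenation does, a hand-written fragment (inside a template matched on the assumed SalesOrder root) would be roughly:

<CustName>
  <xsl:value-of select="concat(Customer/Title, ' ', Customer/FirstName, ' ', Customer/LastName)"/>
</CustName>

The actual XSLT generated by the Data Mapper will be structured differently, but conceptually it does the same thing.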

Let us now add a similar function to sum up the Qty and line amount. Go ahead and add another function, choosing 'Sum' from the 'Collection' functions:

This will create a function called Sum, as expected:

Click on it to configure it. Select 'Qty' for the Value and 'Sales id' for the Scope.


This means I am asking it to sum all the Qty values for a given Sales id. Once done, you can map this function to the TotalQty field of the target schema:
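As with the concatenation, this is just the Data Mapper's way of producing an XPath sum over the repeating lines. A rough hand-written equivalent, using the assumed element names, would be:

<TotalQty>
  <xsl:value-of select="sum(Lines/Line/Qty)"/>
</TotalQty>

The same pattern applies to the line amount in the next step.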

Repeat this for Line Amount. Now the resultant map looks like this:

Click on 'Overview' to visualize the mapping.

You can see a column-by-column overview of the field mappings. You might have been wondering why the 'Test' button at the top is disabled. Click 'Save' and 'Generate XSLT' to enable the test:

The window that opens lets you paste test data as input and submit it.

The result would be shown under the 'Output' tab:

This sums the qty and line amount, and also concatenates the customer name fields.
Step -5: Now come back to your Azure Logic Apps workflow, and right-click on it to open designer mode:

Here you can add a sample logic app design that accepts an HTTP request as a trigger, passes the payload on to the mapper, and then feeds the result to an HTTP response to show the output:

The rest is very simple. Start the project by clicking 'Run' >> 'Start Debugging'. Come back to your Azure Logic Apps workflow, right-click on it, and select 'Overview':

This will give you the URL to test. Fire up Postman, paste this URL, and select XML as the body type:

Click Send. The output will be the same XML outcome that we expected.
Imagine how much time this could save compared with writing and managing a full stack of classes in Azure Functions. Data Mapper could also help you create data packages for D365FO, especially when you are using recurring integrations.
So much for today; I will come back with more such cool features of Azure and D365FO soon. Take care, much love and Namaste....
