
Using Azure API integration with D365FO: Strategy and walkthrough

Subhad365, User Group Leader

Azure API: quick overview

Azure API Management provides a set of reliable connectors to the outside world, letting your application talk to almost any third-party service. Azure API in itself offers a huge sea of possibilities:

a. letting you parse a web request using its rich Newtonsoft (JSON.NET) based policies
b. letting you further call a function app
c. letting you further call a logic app
d. letting you further call another OpenAPI/SOAP based web service




The most appreciated features of Azure API Management are:
a. High availability: being an integral part of the Microsoft Azure offering, Azure APIs can give you undisrupted service.
b. Flexible payment options, like:
   I. Developer/no SLA: choose this option as long as you are on a developer/pre-production environment
   II. Basic: 99.9% SLA
   III. Standard: 99.99% SLA
   IV. Premium: 99.95% SLA
   V. Consumption: 99.9% SLA
c. Security: you can implement various security models, like:
   I. Bearer-token based (OAuth 2.0)
   II. Ocp-Apim-Subscription-Key based
   III. Custom security in the APIM policy itself (see the policy sketch after this list)
d. Role-based access: when I request a product-based path such as /home/products, only product-related details are shown; when I request /home/HR, only human-resource operations are shown. Access Control (IAM) is a smart way to leverage role assignments on the API.
e. Change logs: enable these to trace versioning and editions.
f. Metrics and activity logs: enable these to understand all the activities your API participates in (the calls made to your API, the administrative operations carried out on it, etc.).
g. Certificates: you can install certificates into your API so that it can access a restricted URL seamlessly.
h. Virtual network management: you can set up your own virtual network for your API Management instance:
   I. External: selecting your VNET and subnets
   II. Internal: selecting your VNET and subnets
i. Locks: adding a lock sanctions only a given resource group and subscription to access the instance.
j. Managed identities: a system-assigned managed identity enables Azure resources to authenticate to cloud services without storing credentials in code.
k. Scale and pricing: you can scale up and control your pricing from the 'Scale and pricing' tab.
l. Identities: these let you manage authentication methods for users of the developer portal. By default, API Management requires username and password authentication, but you can also configure online identity providers like Azure AD or Azure AD B2C.
m. Support + troubleshooting:
   I. Health history tells you whether your API is running as expected by analysing your resources.
   II. New support request: you can raise a ticket up front by providing your issue details (basics, solution, details, etc.).
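As an illustration of the custom-policy option in point c above, a minimal sketch of an inbound check on a custom header is shown below. The header name x-api-secret and the named value my-shared-secret are hypothetical placeholders, not something configured in this walkthrough:
<inbound>
    <base />
    <!-- Reject calls that do not carry the expected custom header value -->
    <check-header name="x-api-secret" failed-check-httpcode="401" failed-check-error-message="Missing or invalid secret" ignore-case="true">
        <value>{{my-shared-secret}}</value>
    </check-header>
</inbound>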

The API operation circuitry

Creating an Azure API

Create a new Azure API

Browse down to the Azure portal → API Management services → create new:
a. Name: give a proper name (it must not begin with an underscore)
b. Subscription: select a proper subscription
c. Resource group: ideally, a resource group lets you keep all similar resources together under one umbrella for all related billing.
d. Location: choose the geographic region in which you would like the service hosted.
e. Administrator email: by default this is the organizational email ID with which you have logged on.
Click Create to continue. It takes around 15 minutes to deploy and get the API up and running.

Create a service operation

You can now create an API from a host of available options:
a. Blank API → create an empty API and then keep adding your operations as per your needs
b. OpenAPI → standard, language-agnostic interface to REST APIs
c. WADL → standard XML representation of your RESTful API
d. WSDL → standard XML representation of your SOAP API
e. Logic App → invoke an already created logic app through an API URL, for scalable hybrid integrations and workflows
f. App Service → an API hosted on App Service
g. Function App → a serverless, event-driven utility on App Service
For our example's purposes, let us create a Blank API:
a. Display name: give a suitable name here (avoid using an underscore)
b. Name: by default this is the same as the Display name
c. Web service URL: by default this is the backend service that requests will be redirected to. You can also leave it empty.
d. API URL suffix: the subdirectory that will be created and exposed. Your API now looks like: https://apimanagementservice.azure-api.net/operation1
Click 'Create' to validate/finish.

Settings

You can review the various pieces of information provided in the previous step:
a. Display name: described above
b. Name: described above
c. Description: you can optionally describe what your API actually does
d. Web service URL: this is the backend URL you are connecting to
e. URL scheme: specify whether your API uses HTTP, HTTPS or both
f. API URL suffix: described above
g. Tags: you can group all your APIs under one tag for easier searching and referencing
h. Products: assign Unlimited so as not to limit the API calls
i. Gateways: by default this is saved as Managed
j. Subscription:
   I. If you mark the API as 'Subscription required', then you must also provide:
   II. Header name: how the calling party needs to provide the 'Ocp-Apim-Subscription-Key'
   III. Query parameter name: you can give a custom name to the query parameter

Security:

You can optionally set up OAuth 2.0 security. If you select this option, the system asks for the OAuth 2.0 server (you can configure this in the OAuth 2.0 tab of the Azure API). A minimal token-validation policy is sketched below.
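As a hedged sketch (the tenant ID and audience below are placeholders for your own app registration, not values from this walkthrough), a validate-jwt policy in the inbound section can enforce bearer-token security:
<validate-jwt header-name="Authorization" failed-validation-httpcode="401" failed-validation-error-message="Unauthorized">
    <!-- Placeholder tenant; point this at your own Azure AD tenant -->
    <openid-config url="https://login.microsoftonline.com/{tenant-id}/v2.0/.well-known/openid-configuration" />
    <audiences>
        <audience>api://your-api-app-id</audience>
    </audiences>
</validate-jwt>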

Policies

Policies are a collection of statements that let you control the behaviour of the API through configuration, at either run time or design time. To see/edit the policies, click the box named 'base' in the inbound, outbound or backend processing section. The overall shape of a policy document is shown below.
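For reference, an empty policy document looks like this; the <base /> element pulls in whatever policies are inherited from the product or global scope:
<policies>
    <inbound>
        <base />
    </inbound>
    <backend>
        <base />
    </backend>
    <outbound>
        <base />
    </outbound>
    <on-error>
        <base />
    </on-error>
</policies>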

Examples:

a. We have a requirement to parse an incoming JSON payload that contains a 'url' node. Consider an example like this:
{
    "Message":
    {
        "Id": "HSHDCBMK1898989686881CDY",
        "entityId": "5FJYK173391CF45",
        "eventtype": "Item_Created",
        "url": "https://int.mynetwork.com/item/v1/FFWEE167757711KJHF33",
        "createdby": "md347"
    }
}
For that, you can declare a variable called msgURL and store the parsed value in it. Add the following to the inbound section of the API:
<set-variable name="msgURL" value="@{
    // Read the request body as a JSON object; preserveContent keeps it available downstream
    JObject body = context.Request.Body.As<JObject>(preserveContent: true);
    // Drill into the Message node and return its url value
    string url = (string)body.SelectToken("Message.url");
    return url;
}" />
b. Visiting another URL: suppose we need to call another URL as part of the process. Several scenarios need that, e.g. obtaining a token, or cascaded calls where one API calls another, and you may need to pass a payload as the body of the request. The following example shows how to call a URL held in a variable (for example the URL obtained in the example above):
<send-request mode="new" response-variable-name="itemGETAPI" ignore-error="true">
    <set-url>@{
        return (string)context.Variables["msgURL"];
    }</set-url>
    <set-method>GET</set-method>
</send-request>
Alternatively, you can attach a client certificate to the call via its thumbprint:
<authentication-certificate thumbprint="****THUMBPRINT_VALUE***" />
You can obtain this value by uploading a certificate file in the Azure API → CA certificates section. This kind of setup is more applicable for visiting APIs that do not allow open/direct calls. A sketch of the element in context follows.
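As a hedged sketch, the certificate element sits inside the send-request call from the previous example (the thumbprint remains a placeholder):
<send-request mode="new" response-variable-name="itemGETAPI" ignore-error="true">
    <set-url>@((string)context.Variables["msgURL"])</set-url>
    <set-method>GET</set-method>
    <!-- Client certificate previously uploaded to the API Management instance -->
    <authentication-certificate thumbprint="****THUMBPRINT_VALUE***" />
</send-request>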
c. POST calls: you can also make a POST call to another API and analyse the response:
<send-request mode="new" response-variable-name="TestPostAPI" ignore-error="true">
    <set-url>@{
        return "https://apimanagement.azure-api.net/create/items";
    }</set-url>
    <set-method>POST</set-method>
    <set-header name="Content-Type" exists-action="override">
        <value>application/json</value>
    </set-header>
    <set-body>@{
        // Forward the body of the earlier GET call as the POST payload
        JObject itemBody = ((IResponse)context.Variables["itemGETAPI"]).Body.As<JObject>();
        return itemBody.ToString();
    }</set-body>
</send-request>
Analysing the response:
<return-response>
    <set-status code="@(((IResponse)context.Variables["TestPostAPI"]).StatusCode)" reason="@(((IResponse)context.Variables["TestPostAPI"]).StatusReason)" />
    <set-header name="Content-Type" exists-action="override">
        <value>application/json</value>
    </set-header>
    <set-body>@{
        JObject callResult = ((IResponse)context.Variables["TestPostAPI"]).Body.As<JObject>();
        return callResult.ToString();
    }</set-body>
</return-response>
 

Difference between send-request and send-one-way-request:

A send-request is a synchronous call to an endpoint URL, letting you receive a response and analyse it for further use. A send-one-way-request is more like an asynchronous, fire-and-forget call with no return response. A minimal sketch follows.
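Assuming a hypothetical notification endpoint (not part of the original walkthrough), a one-way call could look like this:
<send-one-way-request mode="new">
    <set-url>https://example.com/notifications</set-url>
    <set-method>POST</set-method>
    <set-header name="Content-Type" exists-action="override">
        <value>application/json</value>
    </set-header>
    <set-body>@{
        // The gateway does not wait for, or capture, a response to this call
        return new JObject(new JProperty("status", "received")).ToString();
    }</set-body>
</send-one-way-request>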

Further Examples: 

Situation: problem statement

We have a third-party logistics system that creates purchase orders in D365FO, then validates and creates GRNs in AX and subsequently posts them.

Solution design

a. A batch job creates an Azure storage file in a shared Azure folder.
b. Custom AX services pick up the file and create purchase orders in D365.
c. The third party receives the purchase order in their own system and tries to replicate it in AX. For that:
   I. It needs to call an AX class that validates various information for a given purchase order.
   II. To do this, you can define an action/method on an exposed entity in D365, which in turn calls the class method from Step I. You can call this action from a logic app.
   III. If it returns OK/true, it proceeds to Step IV; otherwise validation fails.
   IV. It calls another action and pushes a record to a staging table in AX.
   V. A batch picks up records from this staging table and creates/posts packing slips.
For our discussion's sake we are concentrating on the Azure API and logic app interactions; creating and managing batches using the Azure folder can be discussed separately in another blog.

Step 1: the staging table

We can make a staging table that primarily consists of the following information/fields:

Table name

DEMImportPurchaseGRNTable

Fields

Field name | Field type | Extends | Mandatory
PurchId    | String(20) | PurchId | Yes
LineNum    | Real       | LineNum | Yes
Qty        | Real       | Qty     | Yes
ItemId     | String(20) | ItemId  | Yes
ImportId   | String(20) | Num     | No
Status     | Enum       | New enum: DEMProcessStatus (values: In-process, Processed, Error) | No

Properties

I. Caching: Not_In_Transaction
II. Table group: Worksheet line
III. Label: provide a suitable label
IV. Developer documentation: provide suitable documentation
V. Set the created date time and modified date time properties to Yes

Methods

Create the following methods:
a. find
b. exists
c. logic for obtaining a new ImportId
d. initValue:
   this.Status = DEMProcessStatus::InProcess;
   this.ImportId = // call the ImportId logic defined in step 'c' above

Keys

ImportIdIdx:
Field: ImportId
Allow duplicates: No
Set the table's Clustered Index and Surrogate key to ImportIdIdx.

Step 2: method for validating

Create a class: DEMValidateImportGRN.

Validation Method:

public static boolean validatePurchaseOrder(PurchId _purchId, ItemId _itemId, Qty _qty)
{
    PurchLine purchLine;

    if (!PurchTable::exist(_purchId))
    {
        return checkFailed(strFmt("Purchase order %1 does not exist", _purchId));
    }

    select firstonly RecId, Qty from purchLine
        where purchLine.PurchId == _purchId
           && purchLine.ItemId  == _itemId;

    if (!purchLine.RecId)
    {
        return checkFailed(strFmt("Purchase order %1 does not have item %2", _purchId, _itemId));
    }

    if (purchLine.Qty < _qty)
    {
        return checkFailed(strFmt("Insufficient quantity on purchase order %1 for item %2", _purchId, _itemId));
    }

    return true;
}

Defining the action:

You can select an existing entity (or create a new one) → create a new method like this:
[SysODataActionAttribute('validatePurchaseOrder', true)]
public static boolean validatePurchaseOrder(PurchId _purchId, ItemId _itemId, Qty _qty)
{
    return DEMValidateImportGRN::validatePurchaseOrder(_purchId, _itemId, _qty);
}

Step 3: populate the staging table for GRN post

In the same class as above, create a method to populate the staging table:
public static boolean populateStaging(PurchId _purchId, ItemId _itemId, Qty _qty)
{
    DEMImportPurchaseGRNTable purchGRNTable;

    try
    {
        purchGRNTable.initValue();
        purchGRNTable.PurchId = _purchId;
        purchGRNTable.Qty     = _qty;
        purchGRNTable.ItemId  = _itemId;
        // Create your own logic for populating purchGRNTable.LineNum; as a reference, see how PurchLine assigns a new line number
        purchGRNTable.insert();
        return true;
    }
    catch (Exception::Error)
    {
        return checkFailed("Insert failed for GRN import");
    }
}

Step 4: the Azure API

Open https://portal.azure.com → search for the available API Management services. Here, you can create your own API or use an already existing one.

Add a new service operation:

Validate purchase order (validate):


Click on query and add parameters:



Click on Save.
Click on inbound processing → and add the policies:
<inbound>
    <base />
    <set-variable name="purchId" value="@(context.Request.Url.Query.GetValueOrDefault("purchId"))" />
    <set-variable name="itemId" value="@(context.Request.Url.Query.GetValueOrDefault("itemId"))" />
    <set-variable name="qty" value="@(context.Request.Url.Query.GetValueOrDefault("qty"))" />
</inbound>
The above logic obtains the supplied ItemId, PurchId and qty from query parameters.
Now we will call the logic app and pass the above parameters (a sketch of the policy is shown below):
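The original screenshot is not reproduced here, but, as a hedged sketch, the send-request policy it illustrates could look like the following; the logic app callback URL and the response variable name are placeholders you would replace with your own values:
<send-request mode="new" response-variable-name="validateResponse" ignore-error="true">
    <!-- Placeholder: paste the full callback URL (including its signature query string) from the logic app HTTP trigger -->
    <set-url>https://prod-00.region.logic.azure.com/workflows/{workflow-id}/triggers/manual/paths/invoke</set-url>
    <set-method>POST</set-method>
    <set-header name="Content-Type" exists-action="override">
        <value>application/json</value>
    </set-header>
    <set-body>@{
        // Build the JSON payload from the variables captured in the inbound section
        JObject body = new JObject();
        body["purchId"] = (string)context.Variables["purchId"];
        body["itemId"] = (string)context.Variables["itemId"];
        body["qty"] = (string)context.Variables["qty"];
        return body.ToString();
    }</set-body>
</send-request>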


Inside the <set-url> element you set the URL of your logic app.
See how a JSON object is created on the fly and passed on as the request body, like this:
{
    "purchId": "P000111",
    "itemId": "It00001",
    "qty": 5
}

Step 5: logic app to call your custom code:

You can create a logic app like this (for more details, refer to the previous blog on logic app integration):
a. Select the following trigger: 'When a HTTP request is received'.
b. Use the above template for the JSON request body:
{
    "purchId": "P000111",
    "itemId": "It00001",
    "qty": 5
}

c. Add a D365 for F&O action. Go to connections and select your desired D365 instance.



d. Select 'Compose' (a JSON template) as the next action:



In the inputs → create a new JSON like:
{
    "Result": ***Bring in the value output from the above action***
}

e. As a last step, create a Response action that returns the result:

Conclusion

Azure API Management is a vast ocean of opportunities, letting you connect to almost any third-party service. Please consider the following points while using Azure API:
a. If you want a third-party connector with real-time integration, Azure API is the best option.
b. For low-volume, low-frequency connectivity, Azure APIs are also a good solution.
c. For high-volume, high-frequency integrations, Azure APIs are unfortunately not a good fit. In such cases it is better to use Azure Storage, which can be read by a D365 batch class, or you can try Cosmos DB/Cassandra or other data stores that can be made to talk to D365 using DB triggers. I will cover this in another blog.


