How to: Import bank statements from Azure File Storage to Dynamics 365 Finance and Operations using Power Automate (formerly Flow)

We had a requirement to import bank statements into Dynamics 365 Finance and Operations through an integration. We achieved this using recurring data jobs in D365 and Microsoft Power Automate.

The configuration has three parts:

1. Azure app for authentication
2. Data processing group and recurring data job in D365
3. Power Automate flow for the integration


1. Azure app for authentication

Create an Azure app for authentication as follows:
   
Search for "App registrations" on https://portal.azure.com/



Create a new app:





Copy the application (client) ID; we will need it for the configuration in D365 and in the flow.


Create a new secret and copy it.







Copy the value of the secret.

Now we have all the required information from the Azure portal. Let us switch to Dynamics 365 F&O.
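If you want to verify the app registration outside of Flow, here is a minimal Python sketch (using the requests library) of the client-credentials token request that the Active Directory OAuth option in the HTTP action performs behind the scenes in step 3. The tenant, app ID, secret and Dynamics URL below are placeholders, not real values.

import requests

# Placeholder values - substitute your own tenant, app registration and environment.
tenant = "contoso.com"                                   # Azure AD tenant
client_id = "<application (client) ID copied above>"     # from the app registration
client_secret = "<secret value copied above>"            # from the app registration
resource = "https://contoso.operations.dynamics.com"     # D365 F&O base URL

# Client-credentials grant against the Azure AD token endpoint.
token_response = requests.post(
    f"https://login.microsoftonline.com/{tenant}/oauth2/token",
    data={
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "resource": resource,
    },
)
token_response.raise_for_status()
access_token = token_response.json()["access_token"]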

2. Data processing group and recurring data job in D365 F&O

Configure the Bank statement import setup using the steps mentioned in https://docs.microsoft.com/en-us/dynamics365/finance/cash-bank-management/set-up-advanced-bank-reconciliation-import-process

Test the statement import manually and make sure it imports correctly. This should create a new processing group in Data management.







Go to Manage -> Manage recurring data jobs in the top ribbon and create a new recurring job as follows:



Add the application ID from the Azure app created in step 1 and enable it. Enable the recurring job and set the recurrence as needed.

Copy the ID of the recurring data job; we will need it for the flow configuration. Now we are done with the setup in D365 F&O. Time to move on to Power Automate.


3. Power Automate flow for the integration

Jump to https://flow.microsoft.com/

Create a new flow using the scheduled flow template. You can change the trigger to suit your needs.

We need to add several steps to the flow; the end result looks like this:




Of course, it can be adapted to your requirements. I am not an expert in Flow, so if anybody feels this can be done in a better way, let me know :)

I will explain each of the above steps in detail:

1. Recurrence:

This is the trigger point for the flow. It can be instant, an external trigger, or scheduled; in my case it runs every 6 hours.



2. List files 

I am reading the files from Azure File Storage:

Provide the folder name where the bank statement files exist.



3. For each file, add the .csv extension

The bank files I received didn't have any extension, so when Flow reads a file without an extension it treats the content as octet-stream. The content is supposed to be read as CSV, so I add a .csv extension to each file and move it to another folder.
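For illustration, here is a rough Python sketch of that rename-and-move logic using the azure-storage-file-share SDK. The connection string, share name and folder names are assumptions made for this sketch; in the flow itself the same thing is done with the built-in Azure File Storage actions.

from azure.storage.fileshare import ShareServiceClient

# Placeholder connection details - assumptions for this sketch only.
connection_string = "<Azure storage account connection string>"
share_name = "bankstatements"      # hypothetical file share name
incoming_folder = "incoming"       # folder where the extension-less bank files arrive
inprocess_folder = "in-process"    # folder the flow reads the .csv files from

share = ShareServiceClient.from_connection_string(connection_string).get_share_client(share_name)
incoming = share.get_directory_client(incoming_folder)
inprocess = share.get_directory_client(inprocess_folder)

# Copy each extension-less file into the in-process folder with a .csv extension,
# then delete the original (assumes the incoming folder contains only statement files).
for item in incoming.list_directories_and_files():
    name = item["name"]
    source = incoming.get_file_client(name)
    content = source.download_file().readall()
    inprocess.upload_file(name + ".csv", content)
    source.delete_file()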



4. List newly created files for CSV content.



5. Read the CSV content and send it to Dynamics 365 F&O



For each file: read the content, send the HTTP request (details below), copy the file from the in-process folder to the processed folder for history, and delete the in-process file.
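As a rough Python equivalent of this loop, continuing the sketch above (share is the same ShareClient, and send_to_d365 is a hypothetical helper whose body is sketched after the HTTP request details below):

# Placeholder folder names; send_to_d365() is a hypothetical helper - see the
# sketch after the HTTP request details below.
inprocess = share.get_directory_client("in-process")
processed = share.get_directory_client("processed")

for item in inprocess.list_directories_and_files():
    name = item["name"]
    source = inprocess.get_file_client(name)
    content = source.download_file().readall().decode("utf-8")

    send_to_d365(content)                 # enqueue the statement in D365 F&O
    processed.upload_file(name, content)  # keep a copy for history
    source.delete_file()                  # remove it from the in-process folder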

Below are the HTTP request details:



URI: This is the URL, generated as below:

<base URL for Dynamics>/api/connector/enqueue/{<ID from the data processing group>}?entity=Bank%20statements&company=<companyID>

Body: This is the actual content of the CSV file, sent as XML. Adding "<Batch><![CDATA[" before the file content and "]]></Batch>" after it is the most important part. If this is not done, you get the error "Data at the root level is invalid. Line 1, position 1" while importing the file.
This is because the bank statement import process uses XSLT to transform the files and reads the CSV data as an XML node. When you import a bank statement manually, Dynamics does the same thing before sending it to the data processing group for import.

Authentication: Active Directory OAuth

Tenant: The tenant of the AOS, e.g. contoso.com

Audience: The base URL of Dynamics, e.g. https://contoso.operations.dynamics.com

Client ID: The app ID created in step 1, which is also set on the recurring data job.

Secret: The secret value copied from the Azure app.
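Putting the pieces together, here is a minimal Python sketch of the hypothetical send_to_d365 helper used in the loop above. The base URL, recurring data job ID and company are placeholders, and the application/xml content type is an assumption; the important parts are the CDATA wrapper around the CSV content and the Bearer token obtained with the app ID and secret from step 1.

import requests

def send_to_d365(csv_content: str) -> None:
    # Placeholder values - substitute your environment, recurring data job ID and company.
    base_url = "https://contoso.operations.dynamics.com"
    activity_id = "<ID copied from the recurring data job>"
    company = "<companyID>"
    access_token = "<token acquired with the client-credentials request in step 1>"

    # The enqueue URL described above; the recurring data job ID sits inside curly braces.
    uri = f"{base_url}/api/connector/enqueue/{{{activity_id}}}?entity=Bank%20statements&company={company}"

    # Wrap the raw CSV in <Batch><![CDATA[ ... ]]></Batch> so the XSLT transformation
    # reads it as a single XML node; without this you get
    # "Data at the root level is invalid. Line 1, position 1".
    body = "<Batch><![CDATA[" + csv_content + "]]></Batch>"

    response = requests.post(
        uri,
        data=body.encode("utf-8"),
        headers={
            "Authorization": "Bearer " + access_token,
            "Content-Type": "application/xml",   # assumption; adjust if your environment expects another type
        },
    )
    response.raise_for_status()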





 




This was originally posted here.
