Logic App for Azure SQL DB to Azure File Storage Workflow
When building integrations, developers often need to move bulk data from one system to another, and when the target is a remote data store, the solution should keep database calls to a minimum.
In this post, we will integrate Azure SQL Database with Azure File Storage. We will extract bulk data from the database and save it to file storage in comma-separated values (CSV) format so that it can be used for further processing.
This sample interface uses an Azure SQL Database that we created as part of a bring your own database (BYOD) setup, which stores data from Microsoft Dynamics 365 Finance (D365F), such as purchase orders. Once the data is in Azure SQL, we will use Azure Logic Apps to export it in bulk to Azure File Storage in CSV format on a daily basis.
Also read: Azure Data Factory for Azure SQL DB to Azure File Storage Pipeline
First, let’s build a simplified version of the interface in this post, and then we can move to a more optimized version that uses SQL pagination.
Pagination is the process of dividing the transaction result set into discrete subsets.
Logic App
The workflow looks like this:
- A Recurrence trigger is configured to run once a day (a sketch of the trigger definition follows this list).
- The file name is configured to be unique for that day.
- For a complex SQL query, you can create a view and call the records using the Get rows action.
- A condition is set up to check whether Get rows returns any records.
- If records are returned, we create and upload the CSV file to Azure File Storage.
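For reference, here is a rough sketch of what the Recurrence trigger could look like in the Logic App code view. The daily run time (1 AM UTC) is an assumption; pick whatever suits your schedule.

```json
{
  "triggers": {
    "Recurrence": {
      "type": "Recurrence",
      "recurrence": {
        "frequency": "Day",
        "interval": 1,
        "schedule": {
          "hours": [ 1 ]
        }
      }
    }
  }
}
```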
Initialize variable uses a concat() function to build a unique file name for the day and saves it into a variable called FileName:

concat('PO_',utcNow('yyyyMMddmmss'),'.csv')
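In code view, the Initialize variable action might look roughly like the sketch below; the action name and the empty runAfter are illustrative.

```json
{
  "Initialize_variable": {
    "type": "InitializeVariable",
    "inputs": {
      "variables": [
        {
          "name": "FileName",
          "type": "string",
          "value": "@{concat('PO_', utcNow('yyyyMMddmmss'), '.csv')}"
        }
      ]
    },
    "runAfter": {}
  }
}
```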
Get rows (V2) is a SQL Server action that uses the SQL Server name, database name, user name, and password to connect to the server.
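Under the hood, Get rows (V2) is an ApiConnection action against the SQL connector. The sketch below is an approximation of what the designer generates; the server, database, and table names (and the connection reference) are placeholders, not values from this post.

```json
{
  "Get_rows_(V2)": {
    "type": "ApiConnection",
    "inputs": {
      "host": {
        "connection": {
          "name": "@parameters('$connections')['sql']['connectionId']"
        }
      },
      "method": "get",
      "path": "/v2/datasets/@{encodeURIComponent(encodeURIComponent('myserver.database.windows.net'))},@{encodeURIComponent(encodeURIComponent('mydatabase'))}/tables/@{encodeURIComponent(encodeURIComponent('[dbo].[PurchaseOrders]'))}/items"
    },
    "runAfter": {
      "Initialize_variable": [ "Succeeded" ]
    }
  }
}
```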
RowCount is a Compose action that counts the number of records returned by Get rows (V2):

length(body('Get_rows_(V2)')?['value'])
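In code view, the Compose action and the condition that gates the export could look something like the sketch below; the Create CSV table and Create file actions (shown next) would sit inside the condition's actions block.

```json
{
  "RowCount": {
    "type": "Compose",
    "inputs": "@length(body('Get_rows_(V2)')?['value'])",
    "runAfter": {
      "Get_rows_(V2)": [ "Succeeded" ]
    }
  },
  "Condition": {
    "type": "If",
    "expression": {
      "and": [
        { "greater": [ "@outputs('RowCount')", 0 ] }
      ]
    },
    "actions": {},
    "else": { "actions": {} },
    "runAfter": {
      "RowCount": [ "Succeeded" ]
    }
  }
}
```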
Finally, Create CSV table takes the Get rows (V2) output to build the CSV table, and Create file (Azure File Storage) takes the output of Create CSV table to create and upload the file.
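A sketch of these two actions in code view is below; the folder path (/po-exports) and the Azure File Storage connection reference are assumptions.

```json
{
  "Create_CSV_table": {
    "type": "Table",
    "inputs": {
      "from": "@body('Get_rows_(V2)')?['value']",
      "format": "CSV"
    },
    "runAfter": {}
  },
  "Create_file": {
    "type": "ApiConnection",
    "inputs": {
      "host": {
        "connection": {
          "name": "@parameters('$connections')['azurefile']['connectionId']"
        }
      },
      "method": "post",
      "path": "/datasets/default/files",
      "queries": {
        "folderPath": "/po-exports",
        "name": "@variables('FileName')",
        "queryParametersSingleEncoded": true
      },
      "body": "@body('Create_CSV_table')"
    },
    "runAfter": {
      "Create_CSV_table": [ "Succeeded" ]
    }
  }
}
```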
Once the workflow execution is complete, you will see the file created and uploaded to Azure File Storage.
Limitation
The Get rows action returns at most 2048 records in one call. You can get around this by turning on pagination in the action settings.
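Pagination is a per-action setting; in code view it appears as a runtimeConfiguration block on the Get rows (V2) action. A sketch is below, with the action inputs omitted and a threshold of 100000 items chosen only as an example.

```json
{
  "Get_rows_(V2)": {
    "type": "ApiConnection",
    "runtimeConfiguration": {
      "paginationPolicy": {
        "minimumItemCount": 100000
      }
    }
  }
}
```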
We will see a more robust example of pagination using a SQL stored procedure in our next post.
If you liked this post and want to see more content like this, please follow us on Twitter @CursorRun or our Facebook page @CursorRun.