Greetings to all of you!
Since this is a brand new type of request for me, I will try to explain what I need as well as possible.
I have a case which, long story short, requires integrating Business Central with the customer's own Azure SQL database, so that I can retrieve data from this SQL database into Business Central but also push data from BC to this Azure SQL database. Has anyone faced such a case and can guide me through the learning path I need to cover?
Also, since I have very little experience with Azure SQL, is the integration I describe above a good solution for data management? As far as I know, the data in the customer's Azure SQL database comes from txt files located on an FTP server. The customer basically has an automation that retrieves those txt files from the FTP server, extracts the data and then pushes it into Azure SQL. How do you feel about that? Is the integration I am trying to achieve possible to build?
Thank you in advance for your time and your help.
With Respect,
Antonis
Hi, just to add some info.
Current API Limits
https://learn.microsoft.com/en-us/dynamics365/business-central/dev-itpro/api-reference/v2.0/dynamics-current-limits
OData request limits (per environment)
https://learn.microsoft.com/en-us/dynamics365/business-central/dev-itpro/administration/operational-limits-online#ODataServices
Hope this helps as well.
Thanks.
ZHU
1,000 files per day should not be an issue at all, as long as you handle them in smaller batches and do not try to process all 1,000 files in one large batch.
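A minimal sketch of what I mean in AL, where the file list source, the ImportSingleFile procedure and the batch size are just placeholders:

    local procedure ProcessFilesInBatches(FileNames: List of [Text]; BatchSize: Integer)
    var
        FileName: Text;
        Counter: Integer;
    begin
        foreach FileName in FileNames do begin
            ImportSingleFile(FileName); // your existing per-file import logic (placeholder)
            Counter += 1;
            if Counter mod BatchSize = 0 then
                Commit(); // persist the batch and release locks before continuing
        end;
    end;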
Feel free to circle back with any more detailed questions if you run into anything during your project.
Hi Antonis - Glad we were able to assist you
As long as your AL code (I am assuming it will run as a job queue) does not run into errors like database table locking, I do not see this being an issue per se.
However, I'd suggest grabbing a batch of 100 files at a time and processing those 100 records in your table before moving on to the next 100. Loop your solution 10 times this way and see how the performance holds up. In my experience, you may not get a lot of errors, but anything that hurts performance is as bad as an error (or even worse!).
1. Keep a provision (a setup-based field) so that you can increase or decrease this "batch size" as you need to (see the sketch after this list).
2. Check with your Azure admin whether there is enough capacity to deal with such a workload.
3. Consider re-evaluating/optimizing your code and preventing rollbacks when data is written into high-volume tables (Sales Orders, Purchase Orders, etc.) - basically, whichever tables the users visit the most during the day are your high-volume tables. If your custom code involves posting any document, you especially need to take care of the transaction table locking issue (if the posting will run during users' working hours).
4. Try running your job queue (which grabs the files from Azure and puts them into BC) at times when user activity in BC is minimal, or after office hours. This may not be possible in all scenarios, given when you receive the data.
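For point 1, a rough sketch of what such a setup could look like (the object id, names and default value are just examples):

    table 50100 "File Import Setup"
    {
        fields
        {
            field(1; "Primary Key"; Code[10]) { }
            field(2; "File Batch Size"; Integer) { InitValue = 100; }
        }
        keys
        {
            key(PK; "Primary Key") { Clustered = true; }
        }
    }

    // In the job queue codeunit, read the batch size from this setup record:
    local procedure GetBatchSize(): Integer
    var
        FileImportSetup: Record "File Import Setup";
    begin
        if not FileImportSetup.Get() then
            exit(100); // fall back to a default if no setup record exists
        exit(FileImportSetup."File Batch Size");
    end;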
Lastly, it can be a bit time consuming, but if you're really after making this scale up dynamically, you need to monitor where it lags - try implementing telemetry for your custom app. That way, if something is taking too long, you'll get to know:
learn.microsoft.com/.../telemetry-overview
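A bare-bones example of emitting a custom telemetry signal from AL (the event id, message and dimension names here are made up for illustration):

    local procedure LogFileImportDuration(FileName: Text; DurationMs: Integer)
    var
        Dimensions: Dictionary of [Text, Text];
    begin
        Dimensions.Add('fileName', FileName);
        Dimensions.Add('durationMs', Format(DurationMs));
        Session.LogMessage('FIMP0001', 'File import completed', Verbosity::Normal,
          DataClassification::SystemMetadata, TelemetryScope::ExtensionPublisher, Dimensions);
    end;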
P.S.: Optimizing your code can mean one or more of the activities below (non-exhaustive list):
1. Sort wherever necessary (if you know the most recently created data is what you're after, sort descending on the created date/time field).
2. Use filters tightly (if you know you want Document Type = Purchase Receipt and Entry Type = Purchase, make sure to write both down in your SetRange calls, even though it may seem obvious that they would appear together) - see the sketch after this list.
3. Use temporary tables wherever you do intensive looping and/or validation.
4. Introduce error handling (you'd absolutely hate having to manually monitor the JQ every day to see whether there were errors or it ran successfully; use the error handling capabilities in AL to let your JQ keep running even if an error occurs).
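To illustrate points 1, 2 and 4, something along these lines (the "Process ILE Line" codeunit and the logging procedure are only examples):

    local procedure ProcessRecentPurchaseReceipts()
    var
        ItemLedgerEntry: Record "Item Ledger Entry";
    begin
        // Points 1 and 2: newest entries first, and both filters spelled out explicitly
        ItemLedgerEntry.SetCurrentKey("Entry No.");
        ItemLedgerEntry.Ascending(false);
        ItemLedgerEntry.SetRange("Document Type", ItemLedgerEntry."Document Type"::"Purchase Receipt");
        ItemLedgerEntry.SetRange("Entry Type", ItemLedgerEntry."Entry Type"::Purchase);
        if ItemLedgerEntry.FindSet() then
            repeat
                // Point 4: isolate each record so one failure does not stop the whole job queue run
                if not Codeunit.Run(Codeunit::"Process ILE Line", ItemLedgerEntry) then
                    LogProcessingError(ItemLedgerEntry."Entry No.", GetLastErrorText()); // placeholder logging procedure
            until ItemLedgerEntry.Next() = 0;
    end;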
Best of luck :-)
Thank you both for your answers, they helped a lot.
Just FYI, I decided to use Blob Storage via AL for the data transfer!
One last question from me.
In the project we are working on right now, there is a requirement to transfer almost 1,000 txt files (a few KB each) from an SFTP server to BC throughout the day (new txt files will be arriving every few minutes).
As a solution, I proposed that we host the SFTP in Azure with dedicated containers, so that we can store the txt files that we need to bring into BC. After that, through AL codeunits and using an InStream, I will get each txt file, read it line by line and then push the data into the dedicated custom tables I have already developed.
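Roughly, the read step I have in mind looks something like this (the staging table name and fields are placeholders for my custom tables, and I assume the InStream already points at the downloaded txt file):

    local procedure ImportTxtFile(FileName: Text; InStr: InStream)
    var
        ImportLine: Record "My Import Line"; // placeholder for my custom staging table
        LineText: Text;
        LineNo: Integer;
    begin
        while not InStr.EOS() do begin
            InStr.ReadText(LineText); // reads one line per call
            LineNo += 1;
            ImportLine.Init();
            ImportLine."File Name" := CopyStr(FileName, 1, MaxStrLen(ImportLine."File Name"));
            ImportLine."Line No." := LineNo;
            ImportLine."Line Text" := CopyStr(LineText, 1, MaxStrLen(ImportLine."Line Text"));
            ImportLine.Insert();
        end;
    end;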
Do you believe that the number of txt files (1,000 per day) will be an issue for the solution I have proposed? Any concerns based on your experience?
Thank you in advance.
I think the most reliable approach to this would be to build Azure Functions that can connect to both Business Central and Azure SQL and exchange the needed data between the two systems.
But if you do not want to go that way, or do not have the competency to build what's needed for that, there are a few easier alternatives.
You could export data from Business Central, place it in Azure Blob Storage in a file format like JSON, XML or CSV, and then import it into Azure SQL from that blob storage.
And you can use the same technique for exporting data from Azure SQL - place it in Azure Blob Storage and let Business Central read the data from the blob storage.
Depending on what data you want to exchange between Business Central and Azure SQL, and on the number of tables involved, the effort for the last alternative should not be that big.
And I think I would go with JSON files, because then it would be easier at a later stage to move the solution into an Azure Function.
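A rough sketch of the export side in AL, assuming the ABS Blob Client and Storage Service Authorization codeunits from the System Application are available (the table, account name, container and key are just examples):

    local procedure ExportCustomersAsJson()
    var
        Customer: Record Customer;
        ABSBlobClient: Codeunit "ABS Blob Client";
        StorageAuth: Codeunit "Storage Service Authorization";
        CustomerArray: JsonArray;
        CustomerJson: JsonObject;
        Payload: Text;
    begin
        if Customer.FindSet() then
            repeat
                Clear(CustomerJson);
                CustomerJson.Add('no', Customer."No.");
                CustomerJson.Add('name', Customer.Name);
                CustomerArray.Add(CustomerJson);
            until Customer.Next() = 0;
        CustomerArray.WriteTo(Payload);

        // Account, container and key below are placeholders
        ABSBlobClient.Initialize('mystorageaccount', 'bc-export', StorageAuth.CreateSharedKey('<account key>'));
        ABSBlobClient.PutBlobBlockBlobText('customers.json', Payload);
    end;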
Using Azure Logic Apps or Power Automate might also be a possibility, depending on the volume and frequency of the data exchange.
Please circle back if you need more details around any of these approaches.
Hi Antonis - What you have mentioned is a commonly occurring use case that we see every now and then, and yes, it is very much possible. Let me highlight one way this solution can be designed:
1. Create a staging table in Business Central (via AL). If you are not a developer yourself, someone from your team can assist you (a minimal sketch of such a table follows these steps).
2. Use Power Automate or Azure Logic Apps as a data-transfer tier between BC Online and the SQL database.
This PA/ALA flow will periodically pick up data from your Azure SQL database and dump it into the staging table in BC that you created in step 1.
3. In case your PA/ALA flow is unable to read the data from inside the files kept on the SQL server, you can move the individual files as-is to Azure Blob Storage, and then use BC (AL) code to connect to this Blob Storage and read the necessary data in (link: www.linkedin.com/.../business-central-azure-blob-storage-bert-verbeek).
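For step 1, the staging table can be kept very simple, for example something like this (the object id and field names are only for illustration):

    table 50101 "SQL Import Staging"
    {
        fields
        {
            field(1; "Entry No."; Integer) { AutoIncrement = true; }
            field(2; "Source File Name"; Text[250]) { }
            field(3; "Line Text"; Text[2048]) { }
            field(4; "Imported At"; DateTime) { }
            field(5; Processed; Boolean) { }
        }
        keys
        {
            key(PK; "Entry No.") { Clustered = true; }
        }
    }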
Some links:
Connecting PA with Azure Blob Storage: powerautomate.microsoft.com/.../
Connect PA with BC: learn.microsoft.com/.../index
Before you ask, here are some differences between Azure Logic Apps & Power Automate: in short, Power Automate is part of the Power Platform, is licensed per user and is aimed at business users, while Azure Logic Apps is an Azure service, is billed per execution/consumption and is aimed at IT and developer integration scenarios.
BC Online
The approach you need here will vary a bit depending on whether you are running BC Online or BC on-premises.
What is the situation in your case?