As most of you probably know, it is not possible to access the file system from a Business Central cloud environment.
For example, in Dynamics NAV we could have a job queue entry that, when run, creates a file and copies it to a network folder. We can still do that in an on-premises environment, but not with cloud BC.
You could create the file and use DownloadFromStream, but that would only prompt the user to download it locally; it would not copy it to a local or network folder.
If you try to use File.Create() you would get the warning: “The type or method ‘Create’ cannot be used for ‘Extension’ development”.
If your customer is happy to grab the file manually from the Downloads folder every time, then this should suffice:
But if we want to automate this process and run the extract on a regular basis, we need to find a cloud solution for storing the files.
Currently, there are four types of storage in the Azure platform: blob containers, file shares, queues, and tables.
In my previous blog I dove into Azure Table storage and tackled its API.
This blog is about interacting with Azure Storage blob containers:
On Michael Megel’s blog I found a nice solution for exactly what I needed. Awesome job on the Blob Containers API, Michael! Thank you for sharing!
What I need:
Blob Container Setup
To set up a container, following Michael’s notes in the blog above was enough for me.
For blob container accessibility, I went with a shared access signature (a SAS token).
Once the token is created, you can start playing with the storage account container API.
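With a SAS token, authentication is simply a matter of appending the token to the resource URL as a query string. A minimal Python sketch of composing such a URL (the account name matches the walkthrough below, but the token value is a made-up placeholder, not a real credential):

```python
# Build a SAS-authenticated URL for an Azure Blob Storage resource.
# The SAS token is itself a query string (sv=..., sig=..., etc.), so it is
# appended after any resource-specific query options such as comp=list.

def build_sas_url(account: str, resource: str, sas_token: str, **options) -> str:
    """Compose https://<account>.blob.core.windows.net/<resource>?<options>&<sas>."""
    base = f"https://{account}.blob.core.windows.net/{resource}"
    query = "&".join(f"{k}={v}" for k, v in options.items())
    sas = sas_token.lstrip("?")  # tolerate tokens copied with a leading '?'
    parts = [p for p in (query, sas) if p]
    return base + "?" + "&".join(parts)

# Listing containers, as in the GET request shown below (placeholder token):
url = build_sas_url("svflorida", "", "sv=2020-08-04&sig=REDACTED", comp="list")
print(url)  # https://svflorida.blob.core.windows.net/?comp=list&sv=2020-08-04&sig=REDACTED
```

The same helper works for blob-level URLs by passing, for example, `"vendorlist/vl1111"` as the resource.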
I created the storage manually:
Drilling down into the storage account, I created a new container:
1. In VS Code, using the REST Client extension, I sent a GET request to list the containers:

GET https://svflorida.blob.core.windows.net/?comp=list&[here you insert your SAS token key]

HTTP/1.1 201 Created
Last-Modified: Wed, 18 Aug 2021 19:05:13 GMT
Server: Windows-Azure-Blob/1.0 Microsoft-HTTPAPI/2.0
Date: Wed, 18 Aug 2021 19:05:13 GMT
2. I sent a PUT request to insert an empty file:

PUT https://svflorida.blob.core.windows.net/vendorlist/vl1111?[your SAS token here]

Last-Modified: Wed, 18 Aug 2021 19:23:46 GMT
x-ms-request-server-encrypted: true
Date: Wed, 18 Aug 2021 19:23:46 GMT
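The same Put Blob call can be composed in code. A hedged Python sketch using only the standard library (the URL and token are placeholders; the request object is built but deliberately not sent, since that would need live credentials). Note the `x-ms-blob-type` header, which Azure requires for Put Blob:

```python
import urllib.request

# Compose (but do not send) a Put Blob request equivalent to the one above.
# Azure requires the x-ms-blob-type header; BlockBlob is the common choice.
blob_url = ("https://svflorida.blob.core.windows.net/vendorlist/vl1111"
            "?sv=2020-08-04&sig=REDACTED")  # placeholder SAS token
payload = b""  # an empty file, as in the walkthrough

req = urllib.request.Request(blob_url, data=payload, method="PUT")
req.add_header("x-ms-blob-type", "BlockBlob")
req.add_header("Content-Length", str(len(payload)))

print(req.get_method())  # PUT
# urllib.request.urlopen(req) would execute the upload.
```

A successful upload returns 201 Created with the headers shown above.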
And this is the file in Azure portal:
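The listing request from step 1 returns an XML `EnumerationResults` document. A minimal sketch of pulling the container names out of such a response with the standard library (the XML here is a trimmed sample, not a live capture):

```python
import xml.etree.ElementTree as ET

# A trimmed example of the XML that GET ?comp=list returns.
sample = """<?xml version="1.0" encoding="utf-8"?>
<EnumerationResults ServiceEndpoint="https://svflorida.blob.core.windows.net/">
  <Containers>
    <Container><Name>vendorlist</Name></Container>
  </Containers>
</EnumerationResults>"""

root = ET.fromstring(sample)
names = [c.findtext("Name") for c in root.iter("Container")]
print(names)  # ['vendorlist']
```

The same pattern works for listing blobs inside a container (`GET <container>?restype=container&comp=list`), iterating over `Blob` elements instead of `Container`.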
Business Central extension:
This is how the new setup table “Azure Storage Setup” looks in BC:
This is how the new BC interface “Vendors Export Log” looks:
“Write File In Azure” action on page 50251 “Vendor Export Log” does the following:
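The action itself is AL code in the extension; as a language-neutral illustration of the flow it implements (build the vendor file in memory, since cloud BC has no file system, then send it to the blob container), here is a Python sketch. The vendor rows, column names, and file name are made-up placeholders:

```python
import csv
import io

# Hypothetical vendor rows; in BC these would come from the Vendor table.
vendors = [
    ("V00010", "Adatum Corporation"),
    ("V00020", "Contoso Ltd."),
]

# Build the CSV entirely in memory (no file system access needed).
buffer = io.StringIO()
writer = csv.writer(buffer)
writer.writerow(["No.", "Name"])
writer.writerows(vendors)
content = buffer.getvalue().encode("utf-8")

# These bytes would then be PUT to
#   https://<account>.blob.core.windows.net/vendorlist/<filename>?<SAS token>
# with the x-ms-blob-type: BlockBlob header, as in the REST walkthrough above.
print(content.decode("utf-8").splitlines()[0])  # No.,Name
```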
Consume blobs with Excel:
BC users can click the URL link above and download the file locally, or they (and other third-party users) can access the files via Excel, as I explained in my previous blog.
This time, though, when creating the connection choose Data -> Get Data -> From Azure -> From Azure Blob Storage.
And finally, the data displayed in the Excel workbook:
Get Azure Blobs locally
To help with getting the files locally, I wrote two blogs:
For more about storage accounts in Azure check this out.
You can find sample code repository here.