Hello everyone,
I have a custom table, related to the Account table, that is used to store attachments.
The requirement is that any file uploaded to this table should be stored in Azure Blob Storage, so that it does not count against Dataverse file capacity and without setting up a SharePoint integration.
The process should work as follows:
In the custom table, I have a custom column of type File and a Document URL field. When a record is created, a Power Automate flow should be triggered in the background and do the following:
1. Get the file content from the newly created attachment record.
2. Create the file in Azure Blob Storage.
3. Update the record's Document URL field with the blob's URL.
4. Delete the file from Dataverse (to avoid consuming file capacity storage).
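To make the steps concrete, here is a minimal sketch of what the flow would do, written as a Node.js script against the Dataverse Web API and the Azure Blob Storage SDK (in an actual flow you would use the Dataverse connector's download/update actions and the Azure Blob Storage connector instead). Every name here is an assumption: the entity set name new_attachments, the columns new_file and new_documenturl, the container name, and how the access token is obtained.

```javascript
// Minimal sketch of the four steps, with assumed table/column names.
// Requires Node 18+ (global fetch) and @azure/storage-blob.
const { BlobServiceClient } = require("@azure/storage-blob");

const orgUrl = "https://yourorg.crm.dynamics.com";        // assumed environment URL
const apiUrl = `${orgUrl}/api/data/v9.2`;
const accessToken = process.env.DATAVERSE_TOKEN;           // assumed: acquired via MSAL / service principal
const blobConnStr = process.env.AZURE_STORAGE_CONNECTION_STRING;

async function moveFileToBlob(recordId, fileName) {
  const headers = { Authorization: `Bearer ${accessToken}` };

  // 1. Get the file content of the created attachment record.
  const fileResp = await fetch(
    `${apiUrl}/new_attachments(${recordId})/new_file/$value`,
    { headers }
  );
  const fileBuffer = Buffer.from(await fileResp.arrayBuffer());

  // 2. Create the file in Azure Blob Storage.
  const blobService = BlobServiceClient.fromConnectionString(blobConnStr);
  const container = blobService.getContainerClient("attachments"); // assumed container name
  const blob = container.getBlockBlobClient(`${recordId}/${fileName}`);
  await blob.uploadData(fileBuffer);

  // 3. Update the record with the blob URL.
  await fetch(`${apiUrl}/new_attachments(${recordId})`, {
    method: "PATCH",
    headers: { ...headers, "Content-Type": "application/json" },
    body: JSON.stringify({ new_documenturl: blob.url }),
  });

  // 4. Delete the file column data from Dataverse to release file capacity.
  //    (Verify the exact delete endpoint for your environment/version.)
  await fetch(`${apiUrl}/new_attachments(${recordId})/new_file`, {
    method: "DELETE",
    headers,
  });
}
```

One design point to settle early: if the blob container is private, the stored URL will only open in the browser with a SAS token or some other authenticated proxy in front of it, so the "access it through the browser" part depends on how the container access is configured.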
After the flow finishes and the user opens the record in edit mode, the File column will be hidden (we can do that through JS) and they will only see the Document URL of the file, which they can open in the browser to check its content.
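A minimal form script sketch for hiding the File column, assuming the same hypothetical column names; it would be registered on the form's OnLoad event with "Pass execution context as first parameter" enabled.

```javascript
// Hide the File column once the flow has populated the Document URL field,
// so the user can still upload a file when creating the record.
function onFormLoad(executionContext) {
  const formContext = executionContext.getFormContext();

  const docUrl = formContext.getAttribute("new_documenturl"); // assumed column name
  const fileControl = formContext.getControl("new_file");     // assumed column name

  if (docUrl && docUrl.getValue() && fileControl) {
    fileControl.setVisible(false);
  }
}
```

One caveat: hiding the control only affects this form, so the File column would still be visible in views and reachable through the API; whether that matters depends on the requirement.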
Could you please advise whether the above is a good solution to proceed with, or whether there is a better approach? I am open to any suggestions.
Any help is highly appreciated.
Best regards,
EBMRay