Dear Experts,
The idea is to import files through a recurring integration from Azure Blob Storage, process them (post and transfer), log any errors to a custom data entity, and send those errors on to the relevant external systems, if any. We intend to design a single end-to-end process instead of creating one batch job for posting the imported files and another for capturing errors and sending them across.
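For context, the external side of this exchange would use the recurring integrations REST endpoints (enqueue to push a file in, dequeue/ack to pull results out). A minimal sketch of the URLs involved, where the instance URL and activity ID are hypothetical placeholders:

```python
# Sketch of the enqueue/dequeue URLs of the recurring integrations REST API.
# The instance URL below is a hypothetical placeholder, not our real environment.
BASE = "https://myfno.operations.dynamics.com"

def enqueue_url(activity_id: str, entity: str) -> str:
    # Endpoint an external system POSTs a file to, queuing it for import
    # under the recurring job identified by activity_id.
    return f"{BASE}/api/connector/enqueue/{activity_id}?entity={entity}"

def dequeue_url(activity_id: str) -> str:
    # Endpoint an external system polls to pick up an outbound package
    # (e.g. an exported error log), followed by an ack call.
    return f"{BASE}/api/connector/dequeue/{activity_id}"
```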
So far we have identified that when a recurring job is created, two batch tasks are created in the background by these classes:
- SysIntegrationActivityBatch
- SysIntegrationActivityMonitoringBatchTask
Can you please suggest how an event-driven architecture can be implemented, so that the imported files are processed on completion of the import, and any errors are then logged and sent back to the external system? Per the licensing we have bought, we can only use Power Automate.
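To illustrate what we have in mind: the import-completion would raise a business event that triggers a Power Automate flow, and the flow would branch on the event payload. A minimal sketch of that routing decision, where the field names are assumptions and not the actual business-event contract:

```python
def route_import_result(event: dict) -> str:
    # 'Status' is an assumed payload field, not the real event schema.
    if event.get("Status") == "Succeeded":
        return "post-and-transfer"    # continue with posting/transfer
    return "log-error-and-notify"     # write to the custom data entity
                                      # and notify the external system
```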
Best regards,
piku