Finance | Project Operations, Human Resources, ...

Implement event-driven design for recurring integration on D365 FO

Posted by piku

Dear Experts,

The idea is to import a file through recurring integration from Azure Blob Storage, process it (post and transfer), log any errors to a custom data entity, and send them to the relevant external systems. We intend to design this as a single process instead of creating one batch job for posting the imported files and another for capturing errors and sending them across.

So far we have identified that when a recurring job is created, two batch jobs get created in the background by the following classes:

- SysIntegrationActivityBatch

- SysIntegrationActivityMonitoringBatchTask

Can you please suggest how an event-driven architecture can be implemented so that the imported files are processed on completion of the import, and errors are subsequently logged and sent back to the external system? As per the licensing we bought, we can only use Power Automate.

Best regards,

piku

  • Satish Panwar (Moderator):

    Hi Piku,

    Take a look at the Data management package API: you can trigger data export/import from Power Automate, with no need to set up batch jobs in AX.

    docs.microsoft.com/.../data-management-api
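The Data management package API mentioned above is a set of OData actions on the DataManagementDefinitionGroups entity; a Power Automate HTTP action would issue the same request shown here. A minimal Python sketch, assuming a placeholder environment URL, an already-acquired Azure AD token, and a data project named "CustomerImport" (both placeholders):

```python
import json
import urllib.request

FO_BASE = "https://yourenv.operations.dynamics.com"  # placeholder environment URL

def import_from_package_request(package_url, definition_group, legal_entity):
    """Build the ImportFromPackage OData action call (URL + JSON body)."""
    url = (FO_BASE + "/data/DataManagementDefinitionGroups/"
           "Microsoft.Dynamics.DataEntities.ImportFromPackage")
    body = json.dumps({
        "packageUrl": package_url,              # SAS URL of the uploaded package zip
        "definitionGroupId": definition_group,  # data project name in F&O
        "executionId": "",                      # empty lets F&O generate one
        "execute": True,                        # also copy staging -> target
        "overwrite": True,
        "legalEntityId": legal_entity,
    }).encode("utf-8")
    return url, body

def trigger_import(token, package_url, definition_group, legal_entity):
    """POST the action; the response body contains the execution id."""
    url, body = import_from_package_request(package_url, definition_group, legal_entity)
    req = urllib.request.Request(url, data=body, method="POST", headers={
        "Authorization": "Bearer " + token,
        "Content-Type": "application/json",
    })
    return urllib.request.urlopen(req)
```

Because `execute` is set to true, the import copies staging to target in the same call, which is what removes the need for a separate posting batch job.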

  • Sergei Minozhenko:

    Hi Piku,

    You can use the Data package API or OData for import, depending on your requirements.

    If you need additional processing, such as general journal posting, you can implement an OData action endpoint or a web service to initiate the process as soon as the data has been imported.

    For errors, you can use the response from the OData action or web service (synchronous), or business events plus an OData read call / data package API (asynchronous).

    BR, Sergey
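The synchronous option described in this reply can be sketched as follows. Note that the entity name "MyJournalProcessing", the action name "PostImportedJournal", and its parameter are invented for illustration; the real action would be a custom X++ development exposed through OData:

```python
import json
import urllib.request
import urllib.error

def action_url(base_url, entity, action):
    """OData bound-action URL: /data/<Entity>/Microsoft.Dynamics.DataEntities.<Action>."""
    return f"{base_url}/data/{entity}/Microsoft.Dynamics.DataEntities.{action}"

def post_imported_journal(base_url, token, batch_number):
    """Call the (invented) PostImportedJournal action; errors come back synchronously."""
    req = urllib.request.Request(
        action_url(base_url, "MyJournalProcessing", "PostImportedJournal"),
        data=json.dumps({"journalBatchNumber": batch_number}).encode("utf-8"),
        method="POST",
        headers={"Authorization": "Bearer " + token,
                 "Content-Type": "application/json"},
    )
    try:
        with urllib.request.urlopen(req) as resp:
            return {"ok": True, "result": json.load(resp)}
    except urllib.error.HTTPError as err:
        # Posting failure: the error details are in the HTTP response body
        return {"ok": False, "errors": err.read().decode("utf-8")}
```

The appeal of this pattern is that the caller learns about posting errors directly from the HTTP response, without a separate error-export step.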

  • Satish Panwar (Moderator):

    The Data package API is asynchronous, but it can be triggered as you mentioned in your question above.
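Because the Data package API is asynchronous, a caller typically polls the documented GetExecutionSummaryStatus action until the execution finishes. A sketch of that loop, with the status fetch injected so it can run without a live environment:

```python
import time

# Statuses returned by GetExecutionSummaryStatus that mean the run is over
TERMINAL_STATUSES = {"Succeeded", "PartiallySucceeded", "Failed", "Canceled"}

def wait_for_execution(get_status, execution_id, poll_seconds=10, max_polls=60):
    """Poll until a DMF execution reaches a terminal status.

    get_status is injected; in a real integration it would wrap a POST to
    .../DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.GetExecutionSummaryStatus
    with {"executionId": execution_id}.
    """
    for _ in range(max_polls):
        status = get_status(execution_id)
        if status in TERMINAL_STATUSES:
            return status
        time.sleep(poll_seconds)
    raise TimeoutError(f"execution {execution_id} did not finish")
```

In Power Automate the same shape is usually built with a Do-until loop around an HTTP action.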

  • piku:

    Dear Sergey,

    Appreciate the response; however, we want to drive the events after import within D365FO, without any external system's intervention.

    As you mentioned, "implement an OData action endpoint or web service to initiate the process as soon as the data has been imported": that requires a call from an external system, which in our case we want to avoid; we are also using recurring integration.

    Regards.

  • nmaenpaa (Moderator):

    Are you sure you want to use recurring integration in this case? Why not make an OData request, or use a custom service?

  • piku:

    Dear Nikolaos,

    Thanks for the response.

    The reason for RI is to pick up data periodically from the messaging queue and also to make use of standard data entities. After import, we want to trigger the process for further posting of data and error logging, also within D365 and without using middleware or calls from external systems.

    Another approach was to create separate batch jobs: one for processing imported data and another for recording errors.

    I'm relatively new to the D365 FO environment and am currently figuring out the best design.

  • Sergei Minozhenko:

    Hi Piku

    Is there a requirement to pick up data periodically? With the data package API you can import data and process it (copy from staging to the target table) in the same call, without additional batch jobs, and you can use any entity as well. With OData you can also use the standard entities that are available for OData.

    If you want to trigger the posting of imported data after the import happens inside F&O, you will need to program it (for example, monitor the status of DMFDefinitionGroupExecution records). If you go this way, I suggest you do not implement the logic directly but create a one-time batch job instead, to reduce the DMF transaction time.

    About error export: if you want to notify the external system that there are issues, use business events. If you just want to periodically download errors from F&O, you can use the data package API, RI, or OData as well (but you would need to implement the OData filter correctly to get only new errors; in this case, business events + OData is the best option, as you will know the processed document id and can filter the OData call by that id).

    Please keep in mind that RI uses batch processing. If you set the recurrence to run too often, say every 10 minutes, but you only import data a few times per day, it creates unnecessary overhead; in that case the data package API or OData (depending on the amount of data to be imported or exported) is the better choice.

    BR, Sergey
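The "business events + OData filtered by document id" idea in this reply can be sketched as below. The business event would carry the document id to the external system, which then reads only the matching errors. The entity name "IntegrationErrorLogs" and the field names are invented placeholders for the custom error entity described in the question:

```python
from urllib.parse import quote

def error_query_url(base_url, document_id):
    """OData read that fetches only the errors logged for one document id."""
    flt = quote(f"DocumentId eq '{document_id}'")  # URL-encode the $filter expression
    return (f"{base_url}/data/IntegrationErrorLogs"
            f"?$filter={flt}&$select=DocumentId,ErrorMessage")
```

Filtering by the id delivered in the event payload avoids re-downloading old errors, which is the pitfall the reply warns about with plain periodic exports.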

  • Nadia Borshek:

    Hello everyone,

    I created a batch job with the following classes:

    - SysIntegrationActivityBatch

    - SysIntegrationActivityMonitoringBatchTask

    but I am not able to delete the batch jobs.

    Any advice, please?

    Best Regards,

    Nadia

