The idea is to import a file through recurring integration from an Azure Storage blob, process it (post and transfer), log errors to a custom data entity, and send them to the relevant external systems, if any. We intend to design a single end-to-end process instead of creating one batch job for posting the imported files and another for capturing errors and sending them across.
So far we have identified that when a recurring job is created, two batch jobs get created in the background by class:
Can you please suggest how an event-driven architecture can be implemented, so that the imported files are processed on completion of the import and the errors are subsequently logged and sent back to the external system? Per the licensing we have bought, we can only use Power Automate.
Take a look at the data management package API; you can trigger data export/import using Power Automate. There is no need to set up batch jobs in AX.
You can use the Data package API or OData (depending on your requirements) for the import.
If you need additional processing, such as general journal posting, you can implement an OData action endpoint or a custom web service to initiate the process as soon as the data has been imported.
For errors, you can use the response from the OData action or web service (synchronous), or business events plus an OData read call / the Data package API (asynchronous).
The Data package API is asynchronous, but it can be triggered as you mentioned in your question above.
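As a rough illustration of what a Power Automate HTTP action would send, here is a minimal sketch of the data management package API's ImportFromPackage call. The environment URL, project name, and legal entity below are placeholder assumptions, not values from this thread:

```python
# Sketch of the DMF ImportFromPackage request a Power Automate HTTP action
# would issue. All URLs and names are placeholders for illustration only.
import json

BASE_URL = "https://yourenv.operations.dynamics.com"  # assumed environment URL

def build_import_request(package_blob_url: str, definition_group: str, legal_entity: str):
    """Build the OData action URL and body that start a data package import."""
    url = (BASE_URL + "/data/DataManagementDefinitionGroups/"
           "Microsoft.Dynamics.DataEntities.ImportFromPackage")
    body = {
        "packageUrl": package_blob_url,   # SAS URL of the uploaded package
        "definitionGroupId": definition_group,
        "executionId": "",                # empty: let F&O generate an execution id
        "execute": True,                  # copy staging to target in the same run
        "overwrite": True,
        "legalEntityId": legal_entity,
    }
    return url, json.dumps(body)

# Example (placeholder values):
url, payload = build_import_request("https://storage/sas-url", "MyImportProject", "USMF")
```

Because `execute` is set to `True`, the staging-to-target step runs as part of the same call, which is the "no separate batch jobs" point made above.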
Appreciate the response; however, we want to drive the events after import within D365FO, without any external system's intervention.
As you mentioned, "implement oData action endpoint or web service to initiate the process as soon as data will be imported" would require a call from an external system, which in our case we don't want, and we are using recurring integration.
Are you sure you want to use recurring integration in this case? Why not make an OData request, or use a custom service?
Thanks for the response.
The reason for RI is to pick up data periodically from a messaging queue and also to make use of standard data entities. After the import we want to trigger a process for further posting of the data and for logging errors, also within D365 and without using middleware or a call from an external system.
Another approach was to create separate batch jobs: one for processing the imported data and another for recording errors.
I'm relatively new to the D365FO environment and am currently figuring out the best design.
Is there a requirement to pick up data periodically? With the data package API you can import data and process it (copy from staging to the target table) in the same call, without additional batch jobs, and you can use any entity. With OData you can also use the standard entities that are exposed to OData.
If you want to trigger the posting of the imported data after the import completes inside F&O, you will need to program it (for example, by monitoring the status of DMFDefinitionGroupExecution records). If you go this way, I suggest you do not implement the logic directly, but create a one-time batch job instead, to reduce the DMF transaction time.
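Inside F&O the monitoring above would be X++ against DMFDefinitionGroupExecution; purely to illustrate the status check, here is a sketch of the GetExecutionSummaryStatus action that an external poller (or Power Automate) could use instead. The environment URL is a placeholder assumption:

```python
# Sketch of polling a DMF execution's status via the
# GetExecutionSummaryStatus OData action. Placeholder URL.
import json

BASE_URL = "https://yourenv.operations.dynamics.com"  # assumed environment URL

# Terminal states reported for a DMF execution.
TERMINAL = {"Succeeded", "PartiallySucceeded", "Failed", "Canceled"}

def build_status_request(execution_id: str):
    """Build the OData action URL and body that read an execution's status."""
    url = (BASE_URL + "/data/DataManagementDefinitionGroups/"
           "Microsoft.Dynamics.DataEntities.GetExecutionSummaryStatus")
    return url, json.dumps({"executionId": execution_id})

def is_finished(status: str) -> bool:
    """A caller would keep polling until the status is terminal."""
    return status in TERMINAL
```

A poller would POST the request on a delay loop and hand off to the posting/error logic once `is_finished` returns true.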
About error export: if you want to notify an external system that there are issues, use business events. If you just want to periodically download errors from F&O, you can use the data package API, RI, or OData as well (but you would need to implement the OData filter correctly to get only new errors; in that case, business events + OData is the best option, as you will know the processed document id and can filter the OData call by that id).
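The "filter the OData call by that id" step could look like the sketch below. The entity name `ImportErrorLogs` and the `ExecutionId` field are hypothetical stand-ins for the custom error entity mentioned in the question; the environment URL is a placeholder:

```python
# Sketch of an OData read filtered to the errors of one known execution id.
# Entity name, field name, and URL are assumptions for illustration.
from urllib.parse import quote

BASE_URL = "https://yourenv.operations.dynamics.com"  # assumed environment URL

def build_error_query(entity: str, execution_id: str) -> str:
    """Build an OData GET URL that returns only this execution's errors."""
    filt = f"ExecutionId eq '{execution_id}'"  # field name is an assumption
    return f"{BASE_URL}/data/{entity}?$filter={quote(filt)}"

# 'ImportErrorLogs' is a hypothetical custom error entity:
query = build_error_query("ImportErrorLogs", "ABC-123")
```

Filtering by a known id is what makes this reliable: without it, a periodic pull would have to guess which error rows are "new".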
Please keep in mind that RI uses batch processing. If you set the recurrence to run the job too often, such as every 10 minutes, but you only import data a few times per day, it will create unnecessary overhead; in that case, the data package API or OData (depending on the amount of data to be imported or exported) is a better choice.