Hi,
I am new to D365FO. I want to create an Import project to import Customer entity data from a data package. When I click Create recurring data job, enter the Name, Description, and Batch group, and click Save, it gives this error message:
At least one client id must be enabled for the recurring data job before it can be activated.
The intention is to run the import job on a recurring schedule, importing from a specified local folder.
Do we still need to specify an "Application ID" under "Set up authorization policy"?
Looking for your guidance.
Thanks.
Thanks Frank.
Transaction volume also matters. Anything OData-based is not recommended for high-volume transactions (tens of thousands per day); DMF is better suited to handling high volumes.
Great insights. Thanks a lot Martin.
If you call the same API in the same way, it doesn't matter much where you call it from. Note that the F&O connector in Logic Apps is based on the OData service, which you most likely don't want to use. You want to use one of the data management APIs.
Logic Apps may be beneficial if you need additional logic - it may be easier to set it up in Logic Apps than to develop it yourself elsewhere. But as always - it depends.
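As a side note for anyone landing here, the data management API call Martin refers to looks roughly like this. A minimal Python sketch: the host, storage URL, and project name are placeholders, but the parameter names are the ones documented for the ImportFromPackage OData action.

```python
# Sketch of building an ImportFromPackage request for the D365FO data
# management package API. All URLs and IDs below are placeholders.

def import_from_package_request(base_url: str, package_url: str,
                                definition_group_id: str, execution_id: str,
                                legal_entity_id: str) -> tuple:
    """Return the action URL and JSON body for an ImportFromPackage call."""
    action = (f"{base_url}/data/DataManagementDefinitionGroups/"
              "Microsoft.Dynamics.DataEntities.ImportFromPackage")
    body = {
        "packageUrl": package_url,                 # blob URL the package was uploaded to
        "definitionGroupId": definition_group_id,  # name of the import project in F&O
        "executionId": execution_id,               # empty string lets F&O generate one
        "execute": True,                           # run the import, not just stage it
        "overwrite": True,
        "legalEntityId": legal_entity_id,          # target company, e.g. "USMF"
    }
    return action, body

# Hypothetical usage - contoso host and CustomerImport project are made up.
url, body = import_from_package_request(
    "https://contoso.operations.dynamics.com",
    "https://contosostorage.blob.core.windows.net/pkg.zip",
    "CustomerImport", "", "USMF")
```

The actual POST also needs an Azure AD bearer token for the application registered in F&O, which is omitted here.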
Thanks a lot Frank Bruemmer & Martin Dráb.
Your explanation and confirmations made things so clear. Great support from seniors here.
Thanks Martin. Just one more point.
For the data load activity of the migration, is Logic Apps the clear choice, or do RIS and Logic Apps carry equal weight? Please confirm.
Yes, RIS can still be used. The details depend on your requirements - you can map Azure File Storage as a drive, copy the files locally with a script, or modify RIS to read data from any kind of storage you like.
Logic Apps can be used too.
Thanks Martin.
[quote]There are two APIs for file imports. One, the older, merely uploads the file and waits for a job in AX to pick it up and do the import. If you use this older API, you must set up the batch in AX, otherwise nothing would be imported.
The newer API assumes that scheduling is done outside the ERP system. When you import a file, it's imported straight away. It means that you don't need any batch in AX, but you must do some kind of scheduling for calling the API.
The Recurring Integrations Scheduler application supports both APIs, therefore you can decide which one you want to use. Depending on your choice, you need or don't need to set up the batch in AX.
[/quote]Am I right here - the two APIs you mentioned are the batch data APIs (the Recurring integrations API and the Data package API)?
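For reference, the two endpoint shapes the quote describes can be sketched as below. The host name and activity ID are placeholders; the activity ID is the one shown on the recurring data job form in F&O.

```python
# Sketch of the two import API endpoint shapes discussed above.
# Placeholder values only - not production code.

def enqueue_url(base_url: str, activity_id: str, entity: str) -> str:
    """Older Recurring integrations API: the file is enqueued, and a batch
    job inside F&O must be set up to pick it up and run the import."""
    return f"{base_url}/api/connector/enqueue/{activity_id}?entity={entity}"

def package_import_action(base_url: str) -> str:
    """Newer Data package API: the import runs straight away, so scheduling
    (if any) must be done outside the ERP system."""
    return (f"{base_url}/data/DataManagementDefinitionGroups/"
            "Microsoft.Dynamics.DataEntities.ImportFromPackage")
```

This matches the distinction in the quote: with the enqueue endpoint the batch in AX does the work; with the package action the caller owns the schedule.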
As per this link - docs.microsoft.com/.../integration-overview
If the application that creates the source files to be imported into D365FO writes them to a folder/location in Azure Data Factory/Azure Storage, can RIS still be used, or should Logic Apps be used to call the Data package API? Please suggest.
There is nothing in F&O itself for reading from generic folders - that's what RIS is for.
F&O exposes APIs, but you need something to call these APIs, and Microsoft provided RIS as an example of how you can implement an application that reads files and calls these APIs. As mentioned, there are two APIs for this purpose, both supported by RIS. By the way, RIS has a wiki which may answer some of your questions.
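The "application reading files and calling these APIs" part can be sketched in a few lines. This is not RIS's actual code, just an illustration of the pattern under assumed conventions: packages are .zip files in a watched folder, and processed files are renamed so they aren't sent twice.

```python
# Minimal sketch of an RIS-like uploader: watch a local folder and hand each
# data package to an import API. The upload call is passed in as a stub.
import os

def pending_files(folder: str) -> list:
    """Return data package (.zip) files in the folder, oldest first."""
    paths = [os.path.join(folder, f) for f in os.listdir(folder)
             if f.endswith(".zip")]
    return sorted(paths, key=os.path.getmtime)

def process_folder(folder: str, upload) -> int:
    """Upload every pending file, then mark it processed; return the count."""
    count = 0
    for path in pending_files(folder):
        upload(path)                       # e.g. POST to the enqueue endpoint
        os.rename(path, path + ".done")    # simple 'already processed' marker
        count += 1
    return count
```

A real tool would add retries, error handling, and status polling; RIS itself supports both the enqueue and the package APIs, as noted above.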
Thanks - Frank Bruemmer - for explanation and your patience.
I get it now: for recurring DMF data import projects, the Recurring Integrations Scheduler (RIS) is the way to go. As I understand it, it moves files from on-premises to D365FO directly.
If the application that creates the source files writes them to a folder/location in Azure Data Factory/Azure Storage, should we still use RIS? Please suggest.