
Is an API fit for initial master data?

Posted by Voltes
Hi guys,
 
If we have an external system outside D365 FO, and this external system needs master data from F&O, is an API also a good fit for inserting this master data? For example, they need the item master in their system, and in D365 FO we have >5k item records. Should we still use an API for this initial data, or should we use export and import instead?
 
Thanks,
  • Suggested answer
    Martin Dráb, Most Valuable Professional
    You can use web services for data migration as well, but data export/import is typically faster and doesn't require development and an app registration in AAD.
  • Voltes
    Hi Martin,
     
    Will we face a long processing time then? Will it time out?
     
    Thanks
  • Martin Dráb, Most Valuable Professional
    It's difficult to answer, because I don't know what kind of data you'll use, which API you'll use, how you'll implement the solution, how much time the other system will need to process the data, and so on.
     
    I guess the answer is: it may time out if your implementation isn't suitable for the task. For example, it may time out if you try to load 5,000 records at once, but it won't if you split the work into smaller pieces.
  • Voltes
    Hi Martin,
     
    May I know how to "split the work"? Is this still done through the same API?
    Is there any example of that?
    Thanks.
  • Suggested answer
    Kevin Xia, Microsoft Employee
    Hi,
    I think which approach is more appropriate depends on your actual scenario. The advantage of using an API to initialize the data is that the API can format the initial data more easily. With an API, you can trigger requests periodically or in real time as needed to keep your data up to date. Of course, there are also benefits to manual export and import, which does not require you to develop an API. Additionally, using export and import makes it easier to spot and fix potential data issues, because you can clearly visualize the data by manually listing or editing it.
    I think this depends on whether your external system will still need to synchronize the master data in the future. If the external system will often need to synchronize master data, I would choose to develop and use the API.
    Best regards,
    Kevin
  • Voltes
    Hi,
     
    The external system will need to synchronize the master data in the future. But I guess it will be a different API for incremental updates and another API for uploading the initial data (which has >5k records), can I say that? However, based on the input, I now understand that for the initial load it may be more correct to just export and import (bulk).
    Also, may I know what "split the work" means and how it works, in case we do decide to still use the API? And is there a good example of that?
     
    Thanks
  • Layan Jwei, Super User
    Hi Voltes,
     
    By API do you mean OData?
     
    You should consider using OData if you want data in real time and the volume of data is not large.
     
    **Microsoft defines a large volume as more than a few hundred thousand records.**
     
    Looking at your example, 5k records is eligible for OData, but now you need to ask yourself: do you want it in real time or not?
     
    For large volumes of data, you should consider the batch data APIs (recurring integrations or the Data management package REST API).
     
    **By splitting, I mean not returning the 5k records at once; you could, for example, return 1k records per call. In OData there is something called pagination.**
  • Voltes
    Hi,
     
    Thanks for the guidance. For OData with pagination, is adding the parameters "count" and "maxpagesize" enough? I'm trying it like this with Postman:
     
    {{baseUrl}}/data/VendorsV2?cross-company=true&$filter=dataAreaId eq 'TST' &$select=VendorAccountNumber,VendorOrganizationName&count=true&maxpagesize=5
     
    I tried this twice, first with maxpagesize=200 and then with maxpagesize=5. The first took 1 minute 23 seconds and the second took 56 seconds. But I'm also not sure whether I retrieved the whole data set. Is there a way to view this JSON result as a table so we can easily examine the data?
     
    I'm actually looking at this web resource: https://learn.microsoft.com/en-us/power-apps/developer/data-platform/webapi/web-api-query-data-sample#bkmk_filterPagination, but I'm not confident whether I'm doing it right.
     
    Thanks again.
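Editor's note: the "split the work" idea Martin and Layan describe is plain client-side batching. Rather than pushing all 5k item records in one request, you send them in fixed-size chunks, so a timeout only costs you one chunk. A minimal sketch (the chunk size of 1,000 and the `ItemNumber` record shape are illustrative assumptions, not anything prescribed by D365):

```python
def chunked(records, size):
    """Yield successive fixed-size slices of a record list.

    Each slice would become one API call (for example one OData batch
    or one data package), so a failure or timeout affects only that slice
    and can be retried without resending everything.
    """
    for start in range(0, len(records), size):
        yield records[start:start + size]

# Example: 5,000 item records pushed 1,000 at a time -> 5 calls.
items = [{"ItemNumber": f"ITEM-{n:05d}"} for n in range(5000)]
batches = list(chunked(items, 1000))
```

Each batch would then be sent in its own request, ideally with retry logic around the call.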
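Editor's note on the final question: in OData, `$count=true` is a query option, but `odata.maxpagesize` is normally sent as a `Prefer` request header rather than a URL parameter, and the server then returns only one page per request, with an `@odata.nextLink` property pointing at the next page. To retrieve the whole data set, the client must loop over those links. A hedged sketch of that loop (the `get` callable stands in for an authenticated HTTP GET that returns parsed JSON, e.g. a wrapper around `requests.get` with a bearer token; it is an assumption, not a D365 API):

```python
def fetch_all_pages(get, url, page_size=1000):
    """Follow OData server-driven paging until every record is collected.

    `get(url, headers)` must return the parsed JSON body of a GET request.
    The Prefer header asks the server to cap each page at `page_size` rows;
    the server signals remaining data via the @odata.nextLink property.
    """
    headers = {"Prefer": f"odata.maxpagesize={page_size}"}
    records = []
    while url:
        body = get(url, headers=headers)
        records.extend(body.get("value", []))
        url = body.get("@odata.nextLink")  # absent on the final page
    return records
```

To check that you retrieved everything, compare `len(records)` with the total the server reports when you include `$count=true` in the query.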
