
Bulk Export of data with item and its attributes

Hi,
 
We have a client with Engineering Change Management activated, around 200k released product records, and almost every item carrying 50-80 attributes.
 
Given this huge volume, exporting the item details with their attribute values through the data entity "Released engineering product version attribute values" fails when the target is Excel or CSV: at 200k items with 50-80 attributes each, the export runs to roughly 10-16 million rows, well past Excel's row and column limits.
 
What would be the easiest way to export this data to meet the client's requirement?
 
Thanks.
  • Suggested answer
    Navneeth Nagrajan, Super User 2025 Season 2
    Hi,
     
    A few questions:
    1. Are you using a custom data entity or a standard data entity in this case?
     
    A few suggestions:
    1. For a scenario with 200K engineering products and 50-80 attributes each, exporting through Excel or CSV will fail: Excel is limited to 1,048,576 rows (and 16,384 columns), while this export comes to roughly 10-16 million rows (200K products × 50-80 attributes).
    2. Some approaches for handling such a heavy data volume (pick whichever fits):
        a. Azure Synapse Link for Fabric - the latest supported mechanism for surfacing data from D365 SCM.
        b. Run the export as a batch job from the Data Management Framework (DMF) with the destination set to an Azure SQL database, i.e. the BYOD (Bring Your Own Database) approach, and access the exported data there. This approach still exists but is considered a legacy way to feed reports. You can pull the data from the Azure SQL database into a Power BI report or an Excel file and then use Power Query on top of the dataset to model and shape it per the customer's requirements; see the first sketch after this list for reading the exported table. Make sure the database is sized for the volume, e.g. 100-250 DTUs or an equivalent vCore tier. (Reference: https://learn.microsoft.com/en-us/dynamics365/fin-ops-core/dev-itpro/analytics/export-entities-to-your-own-database)
        c. Export the data from DMF into Azure Blob Storage as XML, transform it into a CSV file in the Azure middleware using Azure Logic Apps, and share the CSV with the customer once the file lands back in the blob container; the second sketch below shows the core of that transformation step. (Reference: https://community.dynamics.com/blogs/post/?postid=1ed55df1-3343-4f14-9cbc-2591db38d8c8)
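    For option (b), here is a minimal sketch of reading the exported entity table out of the BYOD Azure SQL database and splitting it into CSV files that each stay under Excel's row limit. It assumes Python with pyodbc; the server, credentials, and table name (EngineeringProductAttributeValues) are hypothetical placeholders - in a real BYOD database the table is named after the entity's public name.

        # Read a BYOD-exported entity table and split it into CSV chunks
        # small enough to open in Excel. Connection string and table name
        # are hypothetical placeholders; adjust to your environment.
        import csv
        import pyodbc

        CONN_STR = (
            "DRIVER={ODBC Driver 18 for SQL Server};"
            "SERVER=yourserver.database.windows.net;"  # hypothetical server
            "DATABASE=byod;UID=exportuser;PWD=secret"  # hypothetical credentials
        )
        TABLE = "EngineeringProductAttributeValues"    # hypothetical table name
        CHUNK_ROWS = 1_000_000                         # below Excel's 1,048,576 limit

        conn = pyodbc.connect(CONN_STR)
        cursor = conn.cursor()
        cursor.execute(f"SELECT * FROM {TABLE}")       # cursor streams rows lazily
        headers = [col[0] for col in cursor.description]

        part, rows_in_part, out, writer = 0, 0, None, None
        for row in cursor:
            if writer is None or rows_in_part >= CHUNK_ROWS:
                if out:
                    out.close()
                part += 1
                out = open(f"attributes_part{part:03}.csv", "w",
                           newline="", encoding="utf-8")
                writer = csv.writer(out)
                writer.writerow(headers)               # repeat header per chunk
                rows_in_part = 0
            writer.writerow(row)
            rows_in_part += 1
        if out:
            out.close()
        conn.close()
        print(f"wrote {part} file(s)")

    Chunking keeps every file openable in Excel if the customer insists on files, but pointing Power BI or Power Query directly at the database avoids the file-size problem entirely.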
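    For option (c), the Logic App's job is essentially an XML-to-CSV transformation over the DMF export sitting in blob storage. As a stand-in for that middleware step, here is a streaming sketch in Python; RECORD_TAG is a hypothetical placeholder, since the actual element names in a DMF export follow the entity's schema, so inspect the file first.

        # Stream a large DMF XML export into a CSV without loading it all
        # into memory. RECORD_TAG is a hypothetical placeholder; check the
        # actual element names in your export before running.
        import csv
        import xml.etree.ElementTree as ET

        SOURCE_XML = "export.xml"   # downloaded from the blob container
        TARGET_CSV = "export.csv"
        RECORD_TAG = "Record"       # hypothetical record element name

        with open(TARGET_CSV, "w", newline="", encoding="utf-8") as out:
            writer = None
            for event, elem in ET.iterparse(SOURCE_XML, events=("end",)):
                if elem.tag != RECORD_TAG:
                    continue
                # one record = one row; child elements become the columns
                fields = {child.tag: (child.text or "") for child in elem}
                if writer is None:  # header comes from the first record
                    writer = csv.DictWriter(out, fieldnames=list(fields),
                                            restval="", extrasaction="ignore")
                    writer.writeheader()
                writer.writerow(fields)
                elem.clear()        # free parsed nodes to keep memory flat

    The same logic maps onto the Logic Apps connectors (blob trigger, transform, blob write); whichever tool you use, the transformation has to stream rather than buffer the whole file, given the volume here.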
     
    Hope this helps. Happy to answer questions, if any.
