Data Management Framework Export Limit

I am trying to create an interface that can be supplied to multiple customers, without having to make any changes to it.

As part of this interface, the Data Management Framework is used to export the data we require. One customer is already using the interface, but when it was given to a new customer, they hit the export limit: the file they are trying to export is 1.6 GB, where the limit is 256 MB.

I have explored changing export formats, which reduced the size of the file considerably.

I am reluctant to use filters to export specific sections of data at a time, as this would require different configuration per customer, as well as an understanding of each customer's data.

I am not able to use the change tracking feature either; at this point in time a full export of the data set is required.

Is there a way to split the export file into equal chunks? E.g. can we export the first 10%, then the next 10%, and so on?

  • Suggested answer
    André Arnaud de Calavon
    RE: Data Management Framework Export Limit

    Hi Daniel,

    Can you explain why you can't handle only the changes in this case? That would be the easiest solution. If you need the export split into multiple files, this is not supported out of the box in data management.

    You can consider a customization or an ISV solution. E.g. we offer a Connectivity Studio add-on where you can set up a 'Split quantity'; it will then create multiple files to overcome this issue.

  • Suggested answer
    Pedro Tornich
    RE: Data Management Framework Export Limit

    Hi DanKerby,

    As André said, there is no out-of-the-box functionality to split files while exporting.

    So your options are as follows:

    1. Use an ISV solution
    2. Use OData
    3. Use custom code + Logic App

    OData has limitations, but you can paginate the results using the $skip and $top query options. In this case you would need to write your own middleware to consume the OData web services and save the data to files (or import it directly).

    The problem with OData is usually performance, since you will need several requests to get all the records you need.

    With the help of a Logic App, you can use custom code to dynamically change the filters on the data project, then you run the project. The logic app would repeat this as many times as needed.

    I had to do something like that once. We created a logic app that changed the data project ranges dynamically, exported the file and then put it in an SFTP folder.

    In your case the logic app would calculate the number of records you need to export and then slice by RecId ranges like "0..5654367832", then "5654367833..56543786452", and so on.

  • Community Member
    RE: Data Management Framework Export Limit

    Hi André,

    We could handle the changes to include filters; however, it doesn't feel like the solution that best fits what we are trying to achieve. Our ideal is that we can give this interface to a customer and have them up and running with our application in a very short time. Understanding their data and putting filters in place could increase the time it takes to get a customer up and running. Additionally, we were providing the export projects as templates to the customer; if we have to use filters, templates are no longer applicable.

    That being said, if we cannot split the file evenly, then maybe filters would be the only way around this without an ISV solution.

    Thank you for your advice!

    Kind regards,

    Dan

  • Community Member
    RE: Data Management Framework Export Limit

    Hi Pedro,

    That's interesting, what you say about logic apps. We initially had the data entity framework demonstrated to us with a logic app, but the assumption was that it didn't give any additional functionality over writing our own service. I know there was resistance when I initially mentioned logic apps, due to us being able to write the logic ourselves, but based on your comments I will do some more research.

    Do you have any material that you could recommend that might help guide me to this solution?

    Unfortunately, I believe both ISV and OData are not options; OData in particular, as we are exporting large volumes of data. The customer where this issue surfaced was trying to export well in excess of 1,000,000 records.

    Thank you very much.

    Kind regards,

    Dan

  • Suggested answer
    Pedro Tornich
    RE: Data Management Framework Export Limit

    Hi Dan,

    Note that this is a hybrid solution between OData and DMF, because the logic app (or your code) will use OData to get the RecIds that will then be used to slice the export.

    To filter your entity by RecId, the entity must have the underlying table's RecId as a field. For example, to filter the Customers entity by RecId you must add the CustTable.RecId field to it.

    So you may choose to use other fields instead of the RecId (the AccountNum, for example) so you don't need to alter your data entities, but that's up to you.

    In my particular case I had to export inventory transactions, so we sliced by Item.

    First we created a new data entity to accommodate all the fields we needed, then we added the following action to it:

    [SysODataActionAttribute("AddItemRange", false)]
    public static str addItemRange(
        DMFDefinitionGroupName _definitionGroupId,
        DMFEntityName _entityName,
        str _items)
    {
        str queryStmt;

        // Find the data project line for this entity, selected for update.
        DMFDefinitionGroupEntity dmfDefGroupEntity = DMFDefinitionGroupEntity::find(_definitionGroupId, _entityName, true);

        var queryData = dmfDefGroupEntity.QueryData;

        // If the project line has no stored query yet, fall back to the
        // entity's default query.
        if (!queryData)
        {
            queryData = DMFUtil::getDefaultQueryForEntity(dmfDefGroupEntity.Entity, dmfDefGroupEntity.DefinitionGroup);
        }

        if (queryData != connull())
        {
            Query query = new Query(queryData);
            QueryBuildDataSource primaryDS = query.dataSourceNo(1);
            queryStmt = primaryDS.toString();

            // Apply the comma-separated item list as a range on the entity's
            // ItemNumber field (MyCustomEntity is a placeholder for your entity).
            QueryBuildRange range = SysQuery::findOrCreateRange(primaryDS, fieldNum(MyCustomEntity, ItemNumber));
            range.value(_items);

            if (query.getSQLStatement(true))
            {
                // Persist the modified query back onto the data project line.
                QueryRun qRun = new QueryRun(query);
                queryData = qRun.pack();

                ttsbegin;
                dmfDefGroupEntity.QueryData = queryData;
                dmfDefGroupEntity.write();
                ttscommit;

                queryStmt = primaryDS.toString();
            }
        }

        // Return the data source as a string so the caller can verify the range.
        return queryStmt;
    }

    OData actions are methods written in X++ and exposed by the OData framework; you just need to decorate the method with the SysODataActionAttribute and the action will show up in logic apps.

    This particular action is a static method and could be part of any entity. If you have to filter different entities, you don't need to add a method to each one; just place the action in a more generic entity (e.g. DataManagementDefinitionGroupEntity) and receive the field name as a parameter, as in the sketch below.

    This method receives the data project name, the entity name and a list of item numbers separated by commas.
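
    A minimal sketch of the generic variant mentioned above might look like the following. This is illustrative rather than Pedro's actual code: the action name AddEntityRange is invented here, and it resolves the field by name at runtime (via fieldName2Id) so one action can serve any entity.

    [SysODataActionAttribute("AddEntityRange", false)]
    public static str addEntityRange(
        DMFDefinitionGroupName _definitionGroupId,
        DMFEntityName _entityName,
        str _fieldName,
        str _rangeValue)
    {
        // Same lookup as in addItemRange: fetch the data project line for update.
        DMFDefinitionGroupEntity dmfDefGroupEntity = DMFDefinitionGroupEntity::find(_definitionGroupId, _entityName, true);

        var queryData = dmfDefGroupEntity.QueryData;

        if (!queryData)
        {
            queryData = DMFUtil::getDefaultQueryForEntity(dmfDefGroupEntity.Entity, dmfDefGroupEntity.DefinitionGroup);
        }

        Query query = new Query(queryData);
        QueryBuildDataSource primaryDS = query.dataSourceNo(1);

        // Resolve the field by name instead of a compile-time fieldNum(),
        // so the same action works against any entity.
        FieldId fieldId = fieldName2Id(primaryDS.table(), _fieldName);
        SysQuery::findOrCreateRange(primaryDS, fieldId).value(_rangeValue);

        // Persist the modified query back onto the data project line.
        QueryRun qRun = new QueryRun(query);

        ttsbegin;
        dmfDefGroupEntity.QueryData = qRun.pack();
        dmfDefGroupEntity.write();
        ttscommit;

        return primaryDS.toString();
    }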

    Here is how our logic app worked:

    1. Declare a variable to hold the Data Project name
    2. Declare a variable to hold the Entity name
    3. Loop through the items entity to get all items we should export
    4. Concatenate the item numbers in a string separating them by comma
    5. Call the custom action mentioned above
    6. Call the ExportToPackage action from the DataManagementDefinitionGroupEntity (see the example request below)
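
    For reference, the ExportToPackage call in step 6 is a documented OData action on the DataManagementDefinitionGroups entity (the "Data management package REST API" in Microsoft's docs). A request along these lines should work; the body values are placeholders you would replace with your own data project name, package name and legal entity:

    POST [Your env root URL]/data/DataManagementDefinitionGroups/Microsoft.Dynamics.DataEntities.ExportToPackage

    {
        "definitionGroupId": "<your data project name>",
        "packageName": "<output package file name>",
        "executionId": "",
        "reExecute": false,
        "legalEntityId": "USMF"
    }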

    Here is the first link you should look at to learn more about logic apps. This is Microsoft's official documentation on Logic Apps; no need for a deep dive, but it's a good place to start and get an overview: https://docs.microsoft.com/en-us/azure/logic-apps/

    Once you get started, look at the following link; it is an example of how to build file-based integrations using logic apps: https://github.com/microsoft/Dynamics-AX-Integration/wiki/File-based-integration-using-Logic-Apps

    From experience, I would say the best way to learn logic apps is by doing. Logic apps are really simple and user friendly.

    If you have a complex solution, you are better off writing your own code, but for complementary solutions like this I would rather use logic apps. If you don't have trillions of calls they can be inexpensive, and if you don't have a complex process they can also be easy to maintain.

  • André Arnaud de Calavon
    RE: Data Management Framework Export Limit

    Hi Dan,

    Do you mean to say that you in fact need a subset with all relevant data in one file? So other customers would get all records if they do not hit the file size limit, and this one could apply a filter to leave out irrelevant records? If so, what exactly is needed from the records in this file? I would like to understand your integration better to be able to give you the best recommendation.

  • Community Member
    RE: Data Management Framework Export Limit

    Hi Pedro,

    That looks like an interesting solution, thank you very much for sharing that.

    Regarding point 3, did you make multiple calls using OData to the target data entity, each time returning 10,000 records until you had the desired volume?

    We would be looking to export around 300,000 records per DMF job, I would say. I think the fewer jobs we have to initiate the better; however, if that doesn't appear to be possible, we'll have to look at reducing the size of each job.

    I will have a play around with this method though, and see where I get to.  Thank you!

  • Community Member
    RE: Data Management Framework Export Limit

    Hi André,

    Unfortunately not. Our application, in its simplest form, matches payments to invoices and then reports those matches back to the customer's ERP. To do this we need a two-way feed; historically this has always been done by sending and receiving flat files over SFTP.

    The data we are trying to export from Dynamics that is hitting the limit is the open items (invoice) data. This is basically anything in the CustTransOpen table, as these are all invoices/documents that can be used when allocating a payment. Everything we are trying to export is relevant to our application, so we can't select a subset of this and ignore the rest.

    Hope that gives you a bit of an overview, happy to try and provide more information if needed.

  • Suggested answer
    Pedro Tornich
    RE: Data Management Framework Export Limit

    Hi Dan,

    You said you would like to export 300k records per DMF job and that the table of interest is CustTransOpen.
    In this case you don't have a specific field you could use to filter on, like the SalesId on sales orders, so you will indeed need to add the CustTransOpen.RecId to your entity. Filtering by the entity's own RecId field would return no results.

    Let's assume you added the CustTransOpen.RecId field to your entity and named it CustTransOpenRecId.

    Then you could use the $skip and $top URL options to get only the RecIds you need; you can also use the $select option to return only the CustTransOpenRecId field.

    Here is an example URL that will return the first 30 customers, but only the customer account field will be fetched:

    [Your env root URL]/data/Customers?$skip=0&$top=30&$orderby=CustomerAccount&$select=CustomerAccount

    So, instead of returning all 30 records, I could return only the 1st and the 30th using 2 separate OData calls, which would be much faster than returning all records contained in the interval.

    URL for 1st CustomerAccount:

    [Your env root URL]/data/Customers?$skip=0&$top=1&$orderby=CustomerAccount&$select=CustomerAccount

    URL for 30th CustomerAccount:

    [Your env root URL]/data/Customers?$skip=29&$top=1&$orderby=CustomerAccount&$select=CustomerAccount

    You will also need to know the total number of records in your entity to loop until you hit the end, so here is an example URL on how to count entity records:

    [Your env root URL]/data/Customers?$top=0&$count=true

    Note that using the $count option can be slow, so you may consider creating an action on your entity that would do the counting in X++ and directly return the result to you.
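
    A minimal sketch of such a counting action, assuming your entity sits over CustTransOpen (the action and method names here are illustrative, not standard APIs):

    [SysODataActionAttribute("GetOpenTransCount", false)]
    public static int64 getOpenTransCount()
    {
        CustTransOpen custTransOpen;

        // count(RecId) is computed on the database side; X++ returns the
        // result in the RecId field of the buffer.
        select count(RecId) from custTransOpen;

        return custTransOpen.RecId;
    }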

    Then, in your logic app you would need a loop to get each range and start each DMF job. Your logic app would look like this:

    1. Declare variable to hold the chunk size (300k in your case)
    2. Declare variable to hold the skip number (starting with zero)
    3. Declare variable to hold the total number of records (let's call it remainingRecords)
    4. Create an Until loop and check if the remainingRecords variable is greater than zero
    5. Get the start RecId using OData call
    6. Increase the skip variable by the chunk size minus one (it should be 299,999 on the second loop for example)
    7. Get the end RecId using OData call
    8. Use the custom action I've mentioned in the other reply to change the data project query
    9. Start the DMF job
    10. Wait for the DMF job to finish with success (otherwise terminate the execution and email someone)
    11. Save the file to SFTP
    12. Decrease the remainingRecords variable by the chunk size

    Note: in #6 you need to add a condition checking whether the remainingRecords variable is less than the chunk size; if it is, increment the skip variable by remainingRecords (minus one) instead of the chunk size.

  • Suggested answer
    Pedro Tornich
    RE: Data Management Framework Export Limit

    You could also have all this done within D365FO.

    You can create a SysOperation class with the same logic as described in the logic app scenario. With this solution you would also be able to schedule the execution as needed (e.g. once a day).

    The SysOperation service class would get the RecId range, change the DMF project query, start the export job, wait for the job execution to finish, and then send the file to SFTP using a third-party DLL.

    Just like in the logic app scenario, the aforementioned logic would sit within a loop to export as many files as necessary.
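
    As a rough sketch of that loop (assumptions: updateProjectRange reuses the query manipulation from the addItemRange action shown earlier, and exportAndUpload wraps the DMF export plus the SFTP transfer; both are hypothetical helpers, not standard APIs):

    internal final class DmfChunkedExportService extends SysOperationServiceBase
    {
        public void exportInChunks(DMFDefinitionGroupName _projectName, DMFEntityName _entityName, int64 _chunkSize)
        {
            CustTransOpen custTransOpen;
            RecId fromRecId = 0;
            int64 rowsInChunk = 0;

            // Walk the table in RecId order, fetching only RecId, and cut a
            // new chunk every _chunkSize records.
            while select RecId from custTransOpen
                order by RecId asc
            {
                if (fromRecId == 0)
                {
                    fromRecId = custTransOpen.RecId;
                }
                rowsInChunk++;

                if (rowsInChunk == _chunkSize)
                {
                    this.exportRange(_projectName, _entityName, fromRecId, custTransOpen.RecId);
                    fromRecId = 0;
                    rowsInChunk = 0;
                }
            }

            // Export the final partial chunk, if any.
            if (rowsInChunk > 0)
            {
                this.exportRange(_projectName, _entityName, fromRecId, custTransOpen.RecId);
            }
        }

        private void exportRange(DMFDefinitionGroupName _projectName, DMFEntityName _entityName, RecId _from, RecId _to)
        {
            // Build a '123..456' style range, push it onto the data project,
            // then run the export and move the resulting file to SFTP.
            str rangeValue = strFmt('%1..%2', _from, _to);

            // updateProjectRange(_projectName, _entityName, rangeValue); // hypothetical helper
            // exportAndUpload(_projectName);                             // hypothetical helper
        }
    }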

    Here is a link with an example on how to send files to SFTP directly from D365FO:

    stoneridgesoftware.com/.../
