Finance | Project Operations, Human Resources, ...
Answered

Best way to generate and return large XML files without saving locally? (RunBaseBatch)

Posted by JR-26060826-0
Hello everybody

I am currently generating an XML file inside X++ and I save it locally before streaming it to the user.

Here is the code:

XmlDocument xmlDoc;
str xmlFileName;
str xmlFilePath;
System.IO.MemoryStream memoryStream;

xmlFileName = _nameFile + '.xml';
xmlFilePath = System.IO.Path::GetTempPath() + xmlFileName;

// Save the document to a temp file on local disk
xmlDoc.save(xmlFilePath);

// Read the whole file back into memory just to stream it to the user
memoryStream = new System.IO.MemoryStream(System.IO.File::ReadAllBytes(xmlFilePath));

File::SendFileToUser(memoryStream, xmlFileName);


This works, but:

Problem

In the future, the generated XML files may become very large.

Saving large files on the local disk is not the best option.

My Question

Is there an alternative solution that allows me to:

  • Generate and return the XML without saving it to a local file,

  • Or store it somewhere safer

  • And still download it to the user when needed?

 

Thanks for the help.

  • André Arnaud de Calavon (303,674; Super User 2026 Season 1)
    Hi,
     
    Can you please clarify if your question is related to Dynamics AX (2012 or before) or Dynamics 365 F&O? You used both tags, which is contradictory. Most likely your question is about F&O, but please confirm.

    Anyway, what is the reason to save it locally? You can directly get the XML content in a memory stream.
  • Suggested answer
    Martin Dráb (239,029; Most Valuable Professional)
    What the best option is depends on what you do with the XML document. Unfortunately, you gave us no description, and your code snippet is completely missing this part.
     
    If you want to keep using the XmlDocument class, notice that it's a wrapper around System.Xml.XmlDocument, which has a method for saving to a Stream. You can create an extension of XmlDocument with a method exposing document.Save(); a rough sketch follows at the end of this reply.
     
    Nevertheless, the idea that saving large files on disk is a problem sounds very strange to me. That is what disks are for. What you should be more concerned about is keeping large files in RAM, which is normally much smaller than disk space.
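
    As a rough illustration of that idea (untested; if the X++ wrapper doesn't expose its inner document, one option is to build on System.Xml.XmlDocument directly; xmlFileName is carried over from the question and the other variable names are illustrative):

    // Sketch: build the document on System.Xml.XmlDocument so Save() can target
    // a MemoryStream, removing the temp-file round trip entirely.
    System.Xml.XmlDocument netDoc = new System.Xml.XmlDocument();
    System.Xml.XmlElement root = netDoc.CreateElement('AuditFile');
    netDoc.AppendChild(root);
    // ... append the remaining sections here ...

    System.IO.MemoryStream memoryStream = new System.IO.MemoryStream();
    netDoc.Save(memoryStream);
    memoryStream.Seek(0, System.IO.SeekOrigin::Begin); // rewind before sending
    File::SendFileToUser(memoryStream, xmlFileName);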
  • JR-26060826-0 (42)
    Hi @Martin Dráb and @André Arnaud de Calavon

    Thanks for your answers.
     

    I'm working on a custom feature in Dynamics 365 Finance & Operations where I need to generate a large XML file.

    This XML is a statutory reporting file called SAF-T (Standard Audit File for Tax purposes) — in practice it's just a very big XML with accounting and financial data.

    The process builds the entire XML in memory (thousands of nodes) and the final structure is stored inside an XmlDocument variable named xmlDoc.

     

    Example (simplified):

     
    XmlDocument xmlDoc = XmlDocument::newBlank();
    XmlElement root = xmlDoc.createElement("AuditFile");
    xmlDoc.appendChild(root);

    // Many sections here...
    // Customers, Suppliers, Ledgers, Tax entries, etc.
    // potentially tens of thousands of XML elements
     

    After building the full XML, I currently save it to a local temp file:

     
    xmlDoc.save(xmlFilePath);
     

    Then I load it again into a MemoryStream just to send it to the user.

    My goal is simply to avoid writing to local disk, because:


    • the XML can become very large in production environments

    • some customers have limited disk space on the AOS

    • developer VMs often run out of space very quickly



    So I just need a way to persist or stream the XML without creating a temp file.

  • Verified answer
    Navneeth Nagrajan (2,538; Super User 2026 Season 1)
     
    For large XML files you can use Azure Blob Storage to host these files. You will need an Azure storage account with a container created. You can reference the Azure.Storage.Blobs .NET assemblies (recommended since PU40).
     
    class AzureBlobStorageHelper
    {
        /// <summary>
        /// Uploads an XML string to Azure Blob Storage using a connection string (recommended).
        /// </summary>
        /// <returns>true if the upload succeeded</returns>
        public static boolean uploadFileToBlobStorage(
            str _connectionString,
            str _containerName,
            str _blobName,
            str _xmlContent)
        {
            boolean ret = false;

            try
            {
                // These types are available natively in D365 FO (PU40+)
                Azure.Storage.Blobs.BlobServiceClient serviceClient;
                Azure.Storage.Blobs.BlobContainerClient containerClient;
                Azure.Storage.Blobs.BlobClient blobClient;
                System.IO.Stream stream;
                System.Text.Encoding utf8;
                System.Byte[] bytes;

                utf8 = System.Text.Encoding::get_UTF8();
                bytes = utf8.GetBytes(_xmlContent);

                // Create a stream from the encoded bytes
                stream = new System.IO.MemoryStream(bytes);

                // Initialize the clients
                serviceClient = new Azure.Storage.Blobs.BlobServiceClient(_connectionString);
                containerClient = serviceClient.GetBlobContainerClient(_containerName);
                blobClient = containerClient.GetBlobClient(_blobName);

                // Upload (overwrites the blob if it already exists)
                blobClient.Upload(stream, true); // true = overwrite

                info(strFmt("XML uploaded successfully: %1/%2", _containerName, _blobName));
                ret = true;
            }
            catch (Exception::Error)
            {
                error("Failed to upload XML to blob storage. Check the connection parameters.");
            }
            catch
            {
                error("Error uploading to blob storage.");
            }

            return ret;
        }

        /// Sample usage; the placeholder values come from a parameter table
        public static void main(Args _args)
        {
            str connectionString = <retrieve from parameter table>;
            str containerName = <blob storage container details in parameter table>;
            str blobName = <the parameter table will host the blob name details>;
            str xmlContent = <xml schema>;

            AzureBlobStorageHelper::uploadFileToBlobStorage(connectionString, containerName, blobName, xmlContent);
        }
    }
    Hope this helps. Happy to answer questions, if any.
  • Martin Dráb (239,029; Most Valuable Professional)
    I already gave you a solution.
     
    But your problem still makes no sense to me. What size of file are you talking about? Getting a few hundred GB of disk space is easy and cheap; saying that you'd rather obtain RAM of the same size sounds ridiculous. The actual problem is RAM, and if you need to work with a huge amount of data, you need to split the work into smaller parts or off-load some data from RAM to... a disk! A rough sketch of such an incremental approach follows below.
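
    To illustrate, a rough, untested sketch using System.Xml.XmlWriter, which writes nodes to the target stream as they are produced instead of holding the whole document in RAM (the element names are placeholders):

    // Sketch: stream the XML out incrementally with System.Xml.XmlWriter.
    // Only the writer's small buffer sits in memory; the data goes straight
    // to whatever stream you target (a file, a blob, ...).
    System.IO.MemoryStream target = new System.IO.MemoryStream(); // or a file/blob stream
    System.Xml.XmlWriterSettings settings = new System.Xml.XmlWriterSettings();
    settings.set_Indent(true);

    System.Xml.XmlWriter writer = System.Xml.XmlWriter::Create(target, settings);

    writer.WriteStartDocument();
    writer.WriteStartElement('AuditFile');

    // Loop over customers, suppliers, ledger entries, etc., and write each
    // element as soon as it is read, instead of appending it to an
    // in-memory document.
    writer.WriteStartElement('Customer');
    writer.WriteElementString('Name', 'Contoso');
    writer.WriteEndElement();

    writer.WriteEndElement(); // </AuditFile>
    writer.WriteEndDocument();
    writer.Flush();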
  • André Arnaud de Calavon (303,674; Super User 2026 Season 1)
    Hi,

    There is a standard SAF-T available in Dynamics 365 Finance: Standard Audit File for Tax (SAF-T) electronic report - Finance | Dynamics 365 | Microsoft Learn.
    Are you in one of the supported regions?
    If yes, are you able to use the standard?
    If not, you can check how Microsoft built this in the standard using Electronic Reporting.
  • Navneeth Nagrajan (2,538; Super User 2026 Season 1)
     
    For the developer VM, as Martin mentioned, you can increase the disk size for hosting large files. If it is a cloud-hosted VM (on your own Azure subscription with the D365 FO tools), you can increase the disk space in Azure. If it is a local VM, you can increase the disk space on the host and create a shared folder to host the large XML files there.

    Alternatively, you can avoid writing to a local disk and write instead to a SharePoint site or to Azure Blob Storage. Since the XML files can be huge, you can consider a CSV-to-XML transformation at the Azure Logic Apps level, or a Power Automate flow if cost per transaction is a challenge. I would rather have the file uploaded to blob storage, because a production deployment will need all these configurations to be set up anyway. The actual production-level XML files won't be hosted in local file storage, and it would be cumbersome to host them there.
     
  • JR-26060826-0 (42)
    Hi @Navneeth Nagrajan @Martin Dráb @André Arnaud de Calavon
     

    Thank you for all the previous feedback.

     

    Let me clarify my scenario so you can better understand my question about file generation and storage for very large SAF-T (PT) XML files in Dynamics 365 Finance.


    Microsoft does not provide SAF-T (PT) in the standard Electronic Reporting configuration.

    Therefore, I must generate the SAF-T file fully custom (which I already do).

     

    The real problem

    For some customers, the SAF-T XML can be HUGE:


    • Monthly SAF-T > 5 GB

    • Annual SAF-T > 100–120 GB



    What I need

    I want to avoid generating these extremely large XML files on the local disk, so after reading all of your suggestions I will try to implement @Navneeth Nagrajan's Azure Blob Storage solution.

     

    Can I then add it as an attachment, for example to a batch job? And I believe I can still do this:

    File::SendFileToUser(memoryStream, xmlFileName);

    but with a reference to the blob container.

    Thanks again for all the help.

     

  • Martin Dráb (239,029; Most Valuable Professional)
    Well, yes, you can store the file in a blob, but you don't need to at all. You wanted to get a stream from the XmlDocument and pass it to sendFileToUser(), which you can, as I explained. You technically can take the data from F&O, store it in a blob, then read the blob again, but all that gives you is what you had at the beginning. It's a waste of resources. Navneeth suggested this solution because he thought you wanted to store the file, which you didn't. My solution avoids the unnecessary storage.

    If you do put the file in a blob, a better idea is giving the user a link to the blob (with a SAS token) instead of loading the file back into F&O and passing it to sendFileToUser(); a rough sketch follows at the end of this reply. What sendFileToUser() does is upload the data to blob storage, so you'd actually store the same file twice in two different blobs. More wasted resources.

    Also, if you use the storage emulator on DEV VMs, the file will still be stored on a disk, which you wanted to avoid. As I explained, that will happen for sure if you use sendFileToUser(). The fact that we removed the code where you saved the file doesn't mean the file won't go to disk at all. And if you use a real Azure storage account, it'll be slower, because you'll transfer 120 GB back and forth.

    By the way, make sure that all AOSes have at least 250 GB of RAM, because your code will hold the 120 GB string in memory at least twice. Good luck with that. But OK, if you're happy with it for now, try it. Then you can come back and we can talk about how to design the thing meaningfully.
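
    For reference, a rough sketch of the SAS-link idea (untested; it assumes the client is created from a connection string containing an account key, which GenerateSasUri requires, and the parameter names are placeholders):

    // Sketch: hand the user a time-limited, read-only SAS link to the blob
    // instead of pulling the file back into F&O.
    Azure.Storage.Blobs.BlobClient blobClient = new Azure.Storage.Blobs.BlobClient(
        connectionString, containerName, blobName);

    System.Uri sasUri = blobClient.GenerateSasUri(
        Azure.Storage.Sas.BlobSasPermissions::Read,
        System.DateTimeOffset::get_UtcNow().AddHours(1.0)); // valid for one hour

    info(strFmt("Download link: %1", sasUri.ToString()));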
