Finance | Project Operations, Human Resources, ...
Database restoration of 2 TB in D365 FO

Hi,
 
I have a production database which is 2 TB in size and want to restore a copy of it in a dev environment.
Can you please let me know the standard procedure to clean up and shrink the database so that it can be restored?
 
Thanks,
Vikas
  • Suggested answer
    Anton Venter (Super User 2025 Season 2)
     
    There is no standard procedure for this, but it generally comes down to the following (give or take):
     
    • Set the database recovery model to simple if possible (this reduces the log file size drastically).
    • Determine which tables to target by running the Disk Usage by Top Tables report in SQL Server Management Studio to see row counts and sizes. There are also T-SQL queries for this.
    • Truncate the "large" tables containing log data that you don't need.
    • Partially delete data from the transaction tables that contain the most data. You will have to decide how much data to delete and how much to keep.
    • Shrink the database and its files (Tasks > Shrink in SSMS); this step is optional, but I recommend doing it.
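    The bullet points above can be sketched in T-SQL roughly as follows. This is a minimal outline, not a turnkey script: the database name MyAXDB, the example tables, the CREATEDDATETIME column, and the one-year cutoff are all placeholders you must adapt to your own findings.

```sql
-- Assumed names: MyAXDB and example log/history tables; adjust to your environment.

-- 1. Switch the recovery model to simple so cleanup does not bloat the log.
ALTER DATABASE MyAXDB SET RECOVERY SIMPLE;

-- 2. Truncate large log-style tables that dev does not need.
TRUNCATE TABLE dbo.DATABASELOG;
TRUNCATE TABLE dbo.BATCHJOBHISTORY;

-- 3. Partially delete old rows from a heavy transaction table (example cutoff: 1 year).
DELETE FROM dbo.DOCUHISTORY
WHERE CREATEDDATETIME < DATEADD(YEAR, -1, GETUTCDATE());

-- 4. Shrink the database and its files to release the freed space.
DBCC SHRINKDATABASE (MyAXDB);
```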
     
     
     
  • Suggested answer
    Diego Mancassola
    Hello, if you are using LCS, you can remove data directly from the BACPAC file before restoring it to your cloud-hosted environment.
     
    First of all, download these tools: https://github.com/d365collaborative/d365fo.tools
     
    Then, using the following command from PowerShell, you can remove data from specific tables:
    Clear-D365TableDataFromBacpac -Path "h:\backup.bacpac" -TableName "SECURITYOBJECTHISTORY","BatchJobHistory", "BOMCALCTRANS", "COSTSHEETCACHE", "DOCUHISTORY", "DOCUREF", "BATCH", "WHSWAVEEXECUTIONHISTORY"
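    If the module is not installed yet, it is available from the PowerShell Gallery via the standard PowerShellGet commands (a quick sketch):

```powershell
# Install and import the d365fo.tools module for the current user.
Install-Module -Name d365fo.tools -Scope CurrentUser
Import-Module d365fo.tools

# Confirm the bacpac cleanup cmdlet is available before pointing it at a real file.
Get-Command Clear-D365TableDataFromBacpac
```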
  • Suggested answer
    André Arnaud de Calavon (Super User 2025 Season 2)
    Hi Vikas,
     
    As mentioned by Anton, you need to take care of this yourself. If you have a database backup, it can only be restored as a whole; after that, you can clean up the data.
     
    You can also consider cleaning up the production environment, as 2 TB is a lot of data. At one client, I found 1 TB of data in the database log, which is an easy win if you clean up old and therefore obsolete log entries.
     
    Via the Power Platform admin center, you can check the capacity usage and table sizes of the production environment, and then decide where to focus the cleanup. There are many cleanup routines available in the standard application, and Microsoft has also created a solution for archiving data.
    If you clean up at the source, the data to be copied is smaller and the copy will be faster. Once copied, you can also consider truncating other transaction tables that are not required in a development environment.
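    Before scheduling the standard cleanup routines, it can help to see how much of the database log is actually old. A sketch, assuming the standard DATABASELOG table and its CREATEDDATETIME column; the two-year cutoff is only an example:

```sql
-- Total database log rows and their age range.
SELECT COUNT(*) AS total_rows,
       MIN(CREATEDDATETIME) AS oldest_entry,
       MAX(CREATEDDATETIME) AS newest_entry
FROM dbo.DATABASELOG;

-- Rows older than two years: candidates for the standard cleanup job.
SELECT COUNT(*) AS old_rows
FROM dbo.DATABASELOG
WHERE CREATEDDATETIME < DATEADD(YEAR, -2, GETUTCDATE());
```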
  • Suggested answer
    Sagar121 (Moderator)
    Hi, please run the query below in your database. It will show the top 50 tables by data size.
    Then review those tables. Start with custom tables first, since you already know their schema and their dependencies on other tables.
    Typically, the document attachment (DocuRef), batch, database log, and DMF staging tables are good candidates for cleanup, as they tend to hold huge amounts of data.
    This query should kick-start your effort to reduce the database size.
     
    -- Top N largest tables by used space (GB)
    -- Change @TopN as needed
    DECLARE @TopN int = 50;
     
    WITH TableSizes AS (
        SELECT
            s.name  AS schema_name,
            t.name  AS table_name,
            -- Count rows once per table (heap or clustered index only),
            -- otherwise rows are counted again for every nonclustered index.
            SUM(CASE WHEN ps.index_id IN (0, 1) THEN ps.row_count ELSE 0 END) AS row_count,
            -- Sizes in MB
            CAST(SUM(ps.reserved_page_count) * 8.0 / 1024 AS DECIMAL(19,2)) AS reserved_mb,
            CAST(SUM(ps.used_page_count)     * 8.0 / 1024 AS DECIMAL(19,2)) AS used_mb,
            CAST((
                SUM(ps.in_row_data_page_count)
              + SUM(ps.lob_used_page_count)
              + SUM(ps.row_overflow_used_page_count)
            ) * 8.0 / 1024 AS DECIMAL(19,2)) AS data_mb,
            CAST((
                SUM(ps.used_page_count)
              - (SUM(ps.in_row_data_page_count)
              +  SUM(ps.lob_used_page_count)
              +  SUM(ps.row_overflow_used_page_count))
            ) * 8.0 / 1024 AS DECIMAL(19,2)) AS index_mb
        FROM sys.dm_db_partition_stats AS ps
        INNER JOIN sys.tables  AS t ON t.object_id = ps.object_id
        INNER JOIN sys.schemas AS s ON s.schema_id = t.schema_id
        GROUP BY s.name, t.name
    )
    SELECT TOP (@TopN)
        schema_name,
        table_name,
        row_count,
        -- Convert to GB
        CAST(reserved_mb / 1024.0 AS DECIMAL(19,2)) AS reserved_gb,
        CAST(used_mb     / 1024.0 AS DECIMAL(19,2)) AS used_gb,
        CAST(data_mb     / 1024.0 AS DECIMAL(19,2)) AS data_gb,
        CAST(index_mb    / 1024.0 AS DECIMAL(19,2)) AS index_gb
    FROM TableSizes
    ORDER BY used_gb DESC;
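    After truncating or deleting from the candidate tables this query surfaces, the freed pages still sit inside the database files; shrinking returns them to the OS. A sketch, where the logical file name 'MyAXDB_Data' and the 100 GB target are placeholders taken from your own sys.database_files output:

```sql
-- List the database files with their current sizes (size is in 8 KB pages).
SELECT name, type_desc,
       CAST(size * 8.0 / 1024 / 1024 AS DECIMAL(19,2)) AS size_gb
FROM sys.database_files;

-- Shrink one file to a target size in MB (logical name and target are examples).
DBCC SHRINKFILE (N'MyAXDB_Data', 102400);  -- target about 100 GB
```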
