In this article, I will talk about Baseline support in DynamicsPerf and how you can use it to resolve performance issues.
Originally, when I first thought about baselines, it was from the perspective of comparing DEV/TEST to production environments. The catch right now is that DynamicsPerf cannot collect from multiple servers, so that comparison isn't available yet, though it is on my radar to add. As version 1.20 started taking shape, I realized that being able to compare query performance to specific points in time would be really, really useful.
It solves two major headaches in performance tuning. The first, and most powerful, is the sudden slowdown. One of the hardest and most stressful issues to resolve is the sudden slowdown at 3pm. When all the users and your boss start calling your cell phone, you know you only have minutes to resolve the issue. You check and there is no database blocking going on, and the hardware seems to be running fine, so it has to be some query that is suddenly not running well. How do you track that down quickly when it's a query that takes milliseconds, not seconds, to run? This is where the baseline capture feature of DynamicsPerf comes into play. When the slowdown occurs, just run the DYNPERF_CaptureStats job and collect data. After that you can open 7-Baseline - Benchmark Queries.sql and use the sample scripts to compare the data you just captured to one of the baselines you created, or even to yesterday's captured data.
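If the capture job is deployed as a SQL Server Agent job (as DynamicsPerf installs it), you can kick it off on demand from a query window rather than waiting for its schedule. A minimal sketch, assuming the job carries the same DYNPERF_CaptureStats name used in this article:

```sql
-- Start the DynamicsPerf capture job immediately.
-- Assumes the SQL Agent job is named DYNPERF_CaptureStats.
EXEC msdb.dbo.sp_start_job @job_name = N'DYNPERF_CaptureStats';
```

Note that sp_start_job returns as soon as the job is queued, so give the job a minute to finish before querying the newly captured data.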
First run:
SELECT *
FROM STATS_COLLECTION_SUMMARY
ORDER BY STATS_TIME DESC
This will give you a list of all of the data captures you have in DynamicsPerf. The first one listed should be the capture you just ran. You will need the RUN_NAME column for the Queries_Slower_than_Baseline sample query in this script.
- Pick the current capture you want to compare
- Pick the baseline you want to compare against
The next step is to use these two values in the next query. Change the two RUN_NAME values in the WHERE clause (shown in red in the original script) to your baseline and current capture names:
----------------------------------------------------------------
--
-- QUERIES_SLOWER_THAN_BASELINE
--
-- Queries whose average elapsed time got worse from the BASELINE
----------------------------------------------------------------
SELECT A.DATABASE_NAME,
       A.QUERY_HASH,
       A.EXECUTION_COUNT,
       A.BEFORE_AVG_TIME,
       A.CURRENT_AVG_TIME,
       A.[TIME_DIFF(ms)],
       A.[%DECREASE],
       A.SQL_TEXT,
       B.QUERY_PLAN AS BEFORE_PLAN,
       C.QUERY_PLAN AS AFTER_PLAN
FROM   (SELECT DISTINCT STARTING.DATABASE_NAME,
               STARTING.QUERY_HASH,
               STARTING.EXECUTION_COUNT,
               STARTING.AVG_ELAPSED_TIME AS BEFORE_AVG_TIME,
               ENDING.AVG_ELAPSED_TIME AS CURRENT_AVG_TIME,
               ENDING.AVG_ELAPSED_TIME - STARTING.AVG_ELAPSED_TIME AS 'TIME_DIFF(ms)',
               CAST(( ENDING.AVG_ELAPSED_TIME - STARTING.AVG_ELAPSED_TIME ) / CASE STARTING.AVG_ELAPSED_TIME
                       WHEN 0 THEN 1  -- avoid divide-by-zero when the baseline time is 0
                       ELSE STARTING.AVG_ELAPSED_TIME
                     END * 100 AS DECIMAL(14, 3)) AS '%DECREASE',
               STARTING.SQL_TEXT,
               STARTING.QUERY_PLAN_HASH AS BEFORE_PLAN_HASH,
               ENDING.QUERY_PLAN_HASH AS AFTER_PLAN_HASH
        FROM   QUERY_STATS_HASH_VW STARTING
               INNER JOIN QUERY_STATS_HASH_VW ENDING
                       ON STARTING.QUERY_HASH = ENDING.QUERY_HASH
                      AND STARTING.DATABASE_NAME = ENDING.DATABASE_NAME
        WHERE  STARTING.RUN_NAME = 'BASE_to_compare_to'  -- your baseline RUN_NAME
           AND ENDING.RUN_NAME = 'Feb_26_2020_804AM'     -- your current capture RUN_NAME
           AND STARTING.AVG_ELAPSED_TIME < ENDING.AVG_ELAPSED_TIME
           AND STARTING.QUERY_HASH <> 0x0000000000000000) AS A
       CROSS APPLY (SELECT TOP 1 QUERY_PLAN
                    FROM   QUERY_PLANS W1
                    WHERE  W1.QUERY_PLAN_HASH = A.BEFORE_PLAN_HASH) AS B
       CROSS APPLY (SELECT TOP 1 QUERY_PLAN
                    FROM   QUERY_PLANS W2
                    WHERE  W2.QUERY_PLAN_HASH = A.AFTER_PLAN_HASH) AS C
ORDER  BY 7 DESC  -- sort by %DECREASE
You should end up with a result set listing each regressed query with its before and after average times, the time difference, the % change, and both query plans.
By default, we sort the list by %DECREASE. This is by design. What's worse: a query that goes from 1ms to 3ms, or a query that goes from 10ms to 15ms? If they have the same number of executions, then 1ms to 3ms is a 200% change while 10ms to 15ms is only 50% worse. So it does depend on how often the query runs as to which one needs to be fixed first. You can always change the statement to ORDER BY 6 to sort by the time difference instead of %DECREASE; both are equally valid. As I always tell DBAs: fix what you find, and don't get hung up on which query is the perfect one to run in DynamicsPerf. They are all valuable.
The second valuable part of baselines is the ability to find queries that are slowing down over time. These are almost always queries that are scanning the database. So why do they get slower over time? Because your database is growing, and a scanning query slows down as the tables grow.
There are many reasons why a query will get slower:
- Blocked query
- Parameter sniffing
- Query scanning data
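To spot queries that are degrading gradually rather than suddenly, you can trend a single query's average elapsed time across every capture you have. A minimal sketch using the same QUERY_STATS_HASH_VW view and STATS_COLLECTION_SUMMARY table as the queries above; the QUERY_HASH value is a placeholder you would take from the comparison query's output:

```sql
-- Trend one query's average elapsed time across all captures,
-- oldest to newest. A steadily rising AVG_ELAPSED_TIME usually
-- means a scan over growing tables.
SELECT S.RUN_NAME,
       S.STATS_TIME,
       Q.AVG_ELAPSED_TIME,
       Q.EXECUTION_COUNT
FROM   QUERY_STATS_HASH_VW Q
       INNER JOIN STATS_COLLECTION_SUMMARY S
               ON Q.RUN_NAME = S.RUN_NAME
WHERE  Q.QUERY_HASH = 0x0000000000000000  -- placeholder: paste the real QUERY_HASH here
ORDER  BY S.STATS_TIME ASC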
At this point you can dive into the queries and begin the performance optimization phase of your task. That is beyond the scope of this article.
As you can see, the BASELINE functionality of DynamicsPerf is a very powerful tool for quickly resolving query performance issues.
Rod “Hotrod” Hansen
