Optimizing High-Volume Transaction Queries
Challenge Overview
The Problem
A critical daily report query was taking 45 minutes to execute, causing delays in morning briefings. The query joined several tables against a transaction table of over 10 million rows.
My Approach
Analyzed the execution plan and identified a full table scan on the transaction log table. Created a composite non-clustered index on `TransactionDate` and `BranchID`. Refactored the query to replace a correlated subquery with a CTE.
Key Lesson
"Indexing strategies must align with query patterns. CTEs can improve readability and sometimes performance over subqueries."
Background
The core banking system generates approximately 500,000 transaction records daily. A legacy stored procedure used for the "Daily Branch Performance" report was taking nearly an hour to run, often timing out or blocking other operations.
Diagnosis
I used SQL Server Profiler to capture the exact execution parameters and then analyzed the execution plan in SSMS. The plan revealed:
- A Table Scan on the `Transactions_Log` table (10M+ rows).
- A high-cost Key Lookup operation due to a missing include column in an existing index.
- A correlated subquery that was executing for every row in the result set.
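The correlated-subquery anti-pattern identified in the plan had roughly the following shape (a reconstruction for illustration: the table name `Transactions_Log` and the columns come from the diagnosis and fix above, but `@ReportDate` and the exact query structure are assumptions, not the production code):

```sql
-- Hypothetical shape of the slow report query (illustrative, not the real schema).
-- The correlated subquery re-computes the branch total for EVERY output row,
-- which is what made it so expensive at 10M+ rows.
SELECT
    t.BranchID,
    t.TransactionDate,
    t.Amount,
    (SELECT SUM(t2.Amount)
       FROM dbo.Transactions_Log AS t2
      WHERE t2.BranchID = t.BranchID
        AND t2.TransactionDate = t.TransactionDate) AS BranchDailyTotal
FROM dbo.Transactions_Log AS t
WHERE t.TransactionDate = @ReportDate;  -- @ReportDate: assumed report parameter
```

With no supporting index, each evaluation of the inner `SELECT` also fell back to scanning the table, compounding the cost.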
The Fix
- Indexing: Created a composite non-clustered index on `(TransactionDate, BranchID)` and included `Amount` and `TransactionType` to cover the query requirements.
- Refactoring: Rewrote the correlated subquery as a Common Table Expression (CTE) so the aggregation is computed once over the filtered set rather than once per output row.
- Testing: Verified the new execution plan showed an Index Seek instead of a Scan.
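The two changes above can be sketched as follows (the index name, `dbo` schema, and the exact query shape are illustrative assumptions; the keyed and included columns are as described above):

```sql
-- Covering index: seek keys on (TransactionDate, BranchID), with the
-- reported columns INCLUDEd so the query is satisfied entirely from the
-- index, avoiding the costly Key Lookups seen in the original plan.
CREATE NONCLUSTERED INDEX IX_TransactionsLog_Date_Branch  -- hypothetical name
    ON dbo.Transactions_Log (TransactionDate, BranchID)
    INCLUDE (Amount, TransactionType);

-- CTE rewrite: the branch aggregation runs once over the filtered set
-- and is then joined back, instead of executing per output row.
WITH BranchTotals AS (
    SELECT BranchID, SUM(Amount) AS BranchDailyTotal
    FROM dbo.Transactions_Log
    WHERE TransactionDate = @ReportDate
    GROUP BY BranchID
)
SELECT t.BranchID, t.TransactionDate, t.Amount, bt.BranchDailyTotal
FROM dbo.Transactions_Log AS t
JOIN BranchTotals AS bt
  ON bt.BranchID = t.BranchID
WHERE t.TransactionDate = @ReportDate;
```

Because the index leads with `TransactionDate`, the equality predicate on the report date lets the optimizer use an Index Seek, and the `INCLUDE` columns make it a covering index for this query.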
Results
The query's execution time dropped from 45 minutes to 42 seconds, a roughly 98% improvement that effectively eliminated the morning delay.