...

Analysis of Quicksort

How is it that quicksort's worst-case and average-case running times differ? Let's start by looking at the worst-case running time. Suppose that we're really unlucky and the partition sizes are really unbalanced. In particular, suppose that the pivot chosen by the partition function is always either the smallest or the largest element in the n-element subarray. Then one of the partitions will contain no elements and the other partition will contain n − 1 elements—all but the pivot. So the recursive calls will be on subarrays of sizes 0 and n − 1. As in merge sort, the time for a given recursive call on an n-element subarray is Θ(n). In merge sort, that was the time for merging, but in quicksort it's the time for partitioning.
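To make the unbalanced case concrete, here is a minimal quicksort sketch using the Lomuto partition scheme, which always picks the last element as the pivot. (The article doesn't fix a particular partition scheme, so this choice is an assumption for illustration.) On an already-sorted input, the last element is always the largest, so every partition puts 0 elements on one side and n − 1 on the other—exactly the worst case described above.

```python
def partition(a, lo, hi):
    """Lomuto partition: pivot is a[hi]; returns the pivot's final index."""
    pivot = a[hi]
    i = lo
    for j in range(lo, hi):          # examine each of the other elements once
        if a[j] <= pivot:
            a[i], a[j] = a[j], a[i]
            i += 1
    a[i], a[hi] = a[hi], a[i]        # put the pivot between the two partitions
    return i

def quicksort(a, lo=0, hi=None):
    """In-place quicksort; partitioning an m-element subarray takes Θ(m) time."""
    if hi is None:
        hi = len(a) - 1
    if lo < hi:
        p = partition(a, lo, hi)
        quicksort(a, lo, p - 1)      # elements <= pivot
        quicksort(a, p + 1, hi)      # elements > pivot

nums = [3, 7, 1, 9, 4]
quicksort(nums)
# nums is now [1, 3, 4, 7, 9]
```

With this pivot rule, feeding in a sorted array makes one recursive call operate on n − 1 elements and the other on 0, so the recursion depth becomes n rather than about log₂ n.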

Worst-case running time

When quicksort always has the most unbalanced partitions possible, then the original call takes cn time for some constant c, the recursive call on n − 1 elements takes c(n − 1) time, the recursive call on n − 2 elements takes c(n − 2) time, and so on. Here's a tree of the subproblem sizes with their partitioning times:

When we total up the partitioning times for each level, we get:

cn + c(n − 1) + c(n − 2) + ⋯ + 2c = c(n + (n − 1) + (n − 2) + ⋯ + 2) = c((n(n + 1)/2) − 1)

The last line is because 1 + 2 + 3 + ⋯ + n is the arithmetic series, as we saw when we analyzed selection sort. (We subtract 1 because for quicksort, the summation starts at 2, not 1.) We have some low-order terms and constant coefficients, but when we use big-Θ notation, we ignore them. In big-Θ notation, quicksort's worst-case running time is Θ(n²).
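The summation above can be checked directly: in the fully unbalanced case, the partition calls operate on subarrays of sizes n, n − 1, …, 2, and each call on an m-element subarray does work proportional to m. A short sketch that sums those sizes and compares against the closed form n(n + 1)/2 − 1:

```python
def worst_case_partition_work(n):
    """Total elements processed across all partition calls when every
    partition is maximally unbalanced (sizes 0 and m - 1 at each step)."""
    total = 0
    m = n
    while m >= 2:        # subproblems have sizes n, n - 1, ..., 2
        total += m       # partitioning an m-element subarray touches m elements
        m -= 1           # the only nonempty recursive call gets m - 1 elements
    return total

for n in [5, 10, 100]:
    # matches the arithmetic series with the first term (1) subtracted
    assert worst_case_partition_work(n) == n * (n + 1) // 2 - 1
```

The c · n²/2 leading term is what big-Θ notation keeps, giving the Θ(n²) worst case.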

Best-case running time

Quicksort's best case occurs when the partitions are as evenly balanced as possible: their sizes either are equal or are within 1 of each other. The ...