A stable sort maintains the relative order of items that have the same key. For example, imagine your data set contains records with an employee id and a name. The initial order is:
1, Jim
2, George
3, Jim
4, Sally
5, George
You want to sort by name. A stable sort will arrange the items in this order:
2, George
5, George
1, Jim
3, Jim
4, Sally
Note that the two “George” records are in the same relative order as they were in the initial list, and the same is true of the two “Jim” records.
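If you want to see this in code, here is a minimal sketch using Python’s built-in sorted(), which is guaranteed to be stable (records with equal names keep their original relative order):

```python
# Python's sorted() is stable: ties on the sort key keep their original order.
records = [(1, "Jim"), (2, "George"), (3, "Jim"), (4, "Sally"), (5, "George")]

by_name = sorted(records, key=lambda r: r[1])   # sort by name only

for emp_id, name in by_name:
    print(emp_id, name)
# 2 George
# 5 George
# 1 Jim
# 3 Jim
# 4 Sally
```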
An unstable sort might arrange the items like this:
5, George
2, George
1, Jim
3, Jim
4, Sally
Heapsort is not stable because operations on the heap can change the relative order of equal items. Not all Quicksort implementations are stable. It depends on how you implement the partitioning.
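To make the partitioning point concrete, here is a minimal Python sketch (not a tuned implementation) of an in-place Quicksort with Lomuto partitioning, keyed on the name field. The long-range swaps inside partition() are what can move records with equal keys past one another; on the example data above, this particular variant leaves the two “Jim” records in reverse order:

```python
# In-place Quicksort with Lomuto partitioning, sorting (id, name) records by name.
# The swaps in partition() can jump a record past another with an equal key,
# which is why this variant is not stable.
def quicksort(a, key, lo=0, hi=None):
    if hi is None:
        hi = len(a) - 1
    if lo < hi:
        p = partition(a, key, lo, hi)
        quicksort(a, key, lo, p - 1)
        quicksort(a, key, p + 1, hi)

def partition(a, key, lo, hi):
    pivot = key(a[hi])                 # last element as the pivot
    i = lo
    for j in range(lo, hi):
        if key(a[j]) <= pivot:
            a[i], a[j] = a[j], a[i]    # this swap can leapfrog an equal key
            i += 1
    a[i], a[hi] = a[hi], a[i]
    return i

records = [(1, "Jim"), (2, "George"), (3, "Jim"), (4, "Sally"), (5, "George")]
quicksort(records, key=lambda r: r[1])
print(records)
# [(2, 'George'), (5, 'George'), (3, 'Jim'), (1, 'Jim'), (4, 'Sally')]
# The two "Jim" records ended up in the reverse of their original order.
```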
Although Heapsort has a worst-case complexity of O(n log(n)), that doesn’t tell the whole story. In real-world implementations, there are constant factors that the theoretical analysis doesn’t take into account. In the case of Heapsort vs. Quicksort, it turns out that there are ways (median-of-five pivot selection, for example) to make Quicksort’s worst cases very rare indeed. Also, maintaining a heap is not free.
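As one illustration of that pivot-sampling idea, here is a possible sketch of a median-of-five pivot chooser (the function name and the small-array fallback are my own; there are many variations on this theme):

```python
import random

def median_of_five_pivot(a, key, lo, hi):
    """Return the index of the median of five randomly sampled elements of a[lo..hi].

    Swap a[result] into the pivot position before partitioning (e.g. to hi for
    the Lomuto scheme above). For very small sub-arrays, fall back to the last element.
    """
    if hi - lo < 4:
        return hi
    sample = random.sample(range(lo, hi + 1), 5)   # five distinct positions
    sample.sort(key=lambda i: key(a[i]))           # order them by their keys
    return sample[2]                               # the middle one is the median
```

Choosing the pivot from a small sample like this doesn’t change the O(n log(n)) average, but it makes the lopsided splits that lead to quadratic behaviour far less likely.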
Given an array of randomly ordered data, Quicksort and Heapsort will both run in O(n log(n)), but Quicksort will execute faster because its constant factors are smaller than Heapsort’s. To put it simply, partitioning is faster than maintaining the heap.
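If you would rather measure the constant-factor difference than take it on faith, a rough benchmark sketch like the one below, with both sorts written in plain Python so the comparison is apples-to-apples, will typically show the textbook Quicksort finishing ahead of the textbook Heapsort on random data (exact timings vary by machine and interpreter):

```python
import random
import time

def heapsort(a):
    """In-place heapsort: build a max-heap, then repeatedly move the max to the end."""
    n = len(a)

    def sift_down(root, end):
        while 2 * root + 1 <= end:
            child = 2 * root + 1
            if child + 1 <= end and a[child] < a[child + 1]:
                child += 1                       # pick the larger child
            if a[root] < a[child]:
                a[root], a[child] = a[child], a[root]
                root = child
            else:
                break

    for start in range(n // 2 - 1, -1, -1):      # heapify
        sift_down(start, n - 1)
    for end in range(n - 1, 0, -1):              # repeatedly extract the max
        a[0], a[end] = a[end], a[0]
        sift_down(0, end - 1)

def quicksort(a, lo=0, hi=None):
    """In-place Quicksort with a Lomuto partition and a random pivot."""
    if hi is None:
        hi = len(a) - 1
    if lo < hi:
        k = random.randint(lo, hi)               # random pivot keeps bad splits rare
        a[k], a[hi] = a[hi], a[k]
        pivot, i = a[hi], lo
        for j in range(lo, hi):
            if a[j] <= pivot:
                a[i], a[j] = a[j], a[i]
                i += 1
        a[i], a[hi] = a[hi], a[i]
        quicksort(a, lo, i - 1)
        quicksort(a, i + 1, hi)

data = [random.random() for _ in range(100_000)]

h = list(data)
t0 = time.perf_counter(); heapsort(h);  t1 = time.perf_counter()
q = list(data)
t2 = time.perf_counter(); quicksort(q); t3 = time.perf_counter()

assert h == q == sorted(data)
print(f"heapsort:  {t1 - t0:.2f}s")
print(f"quicksort: {t3 - t2:.2f}s")
```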