There are several sorting algorithms; each one has a different best- and worst-case time complexity and is optimal for a particular type of data structure. Let’s look at some of the sorting algorithms that have a best-case or average-case time complexity of O(n log n).
Merge sort is a classic example of a divide-and-conquer algorithm. It has a guaranteed running time complexity of O(n log n) in the best, average, and worst case. It essentially follows two steps:

1. Divide the unsorted array into two halves and recursively sort each half.
2. Merge the two sorted halves into a single sorted array.
Read up on how to implement a merge sort algorithm here.
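As a sketch of the two steps above, a minimal merge sort in Java might look like the following (the class and method names are illustrative, not from a particular library):

```java
import java.util.Arrays;

public class MergeSort {
    public static void sort(int[] arr) {
        if (arr.length > 1) sort(arr, 0, arr.length - 1, new int[arr.length]);
    }

    private static void sort(int[] arr, int lo, int hi, int[] tmp) {
        if (lo >= hi) return;               // a single element is already sorted
        int mid = lo + (hi - lo) / 2;
        sort(arr, lo, mid, tmp);            // step 1: recursively sort the left half
        sort(arr, mid + 1, hi, tmp);        // step 1: recursively sort the right half
        merge(arr, lo, mid, hi, tmp);       // step 2: merge the two sorted halves
    }

    private static void merge(int[] arr, int lo, int mid, int hi, int[] tmp) {
        int i = lo, j = mid + 1, k = lo;
        // Take the smaller head element from either half until one half is empty.
        while (i <= mid && j <= hi) tmp[k++] = (arr[i] <= arr[j]) ? arr[i++] : arr[j++];
        while (i <= mid) tmp[k++] = arr[i++];
        while (j <= hi)  tmp[k++] = arr[j++];
        System.arraycopy(tmp, lo, arr, lo, hi - lo + 1);
    }

    public static void main(String[] args) {
        int[] data = {5, 2, 9, 1, 5, 6};
        sort(data);
        System.out.println(Arrays.toString(data)); // prints [1, 2, 5, 5, 6, 9]
    }
}
```

Because the array is halved at every level and each level does O(n) merge work, the total work is O(n log n) regardless of the input order.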
Heapsort uses the heap data structure for sorting. The largest (or smallest) element sits at the root of the heap and is extracted in O(1) time; the rest of the heap is then re-arranged so that the next largest (or smallest) element moves to the root, which takes O(log n) time. Repeating this over n elements makes the overall time complexity of a heap sort O(n log n). Learn how to implement a heap sort here.
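A compact in-place heapsort in Java could look like this (a sketch; the helper names are my own). It first builds a max-heap, then repeatedly swaps the root to the end of the array and restores the heap property:

```java
import java.util.Arrays;

public class HeapSort {
    public static void sort(int[] arr) {
        int n = arr.length;
        // Build a max-heap in O(n): sift down every non-leaf node.
        for (int i = n / 2 - 1; i >= 0; i--) siftDown(arr, i, n);
        // Repeatedly extract the max (root) and shrink the heap.
        for (int end = n - 1; end > 0; end--) {
            int t = arr[0]; arr[0] = arr[end]; arr[end] = t; // O(1) extraction
            siftDown(arr, 0, end);                           // O(log n) re-heapify
        }
    }

    // Restore the max-heap property for the subtree rooted at i,
    // considering only the first n elements of the array.
    private static void siftDown(int[] arr, int i, int n) {
        while (true) {
            int largest = i, l = 2 * i + 1, r = 2 * i + 2;
            if (l < n && arr[l] > arr[largest]) largest = l;
            if (r < n && arr[r] > arr[largest]) largest = r;
            if (largest == i) return;
            int t = arr[i]; arr[i] = arr[largest]; arr[largest] = t;
            i = largest;
        }
    }

    public static void main(String[] args) {
        int[] data = {4, 10, 3, 5, 1};
        sort(data);
        System.out.println(Arrays.toString(data)); // prints [1, 3, 4, 5, 10]
    }
}
```

The n extractions at O(log n) each dominate the O(n) heap construction, giving the O(n log n) total described above.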
Like merge sort, quick sort is a divide-and-conquer algorithm that follows three essential steps:

1. Choose an element of the array as the pivot.
2. Partition the array so that elements smaller than the pivot come before it and larger elements come after it.
3. Recursively apply the same steps to the subarrays on either side of the pivot.
However, the choice of the pivot actually determines the performance of quicksort. If the first or the last element of the array is chosen as the pivot, then on already-sorted (or reverse-sorted) input quicksort degrades to its worst-case time complexity of O(n²). But if a good pivot is chosen, the time complexity is O(n log n) on average, with real-world performance often exceeding that of merge sort. An implementation of quicksort in Java is given here.
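The three steps above can be sketched in Java as follows. This version picks a random pivot (one common way to avoid the O(n²) worst case on sorted input) and uses the Lomuto partition scheme; both choices are illustrative assumptions, not the only options:

```java
import java.util.Arrays;

public class QuickSort {
    public static void sort(int[] arr) { sort(arr, 0, arr.length - 1); }

    private static void sort(int[] arr, int lo, int hi) {
        if (lo >= hi) return;
        int p = partition(arr, lo, hi); // pivot ends up at its final index p
        sort(arr, lo, p - 1);           // recurse on elements smaller than the pivot
        sort(arr, p + 1, hi);           // recurse on elements larger than the pivot
    }

    // Lomuto partition: move a randomly chosen pivot to the end, then sweep
    // once, keeping smaller elements to the left of index i.
    private static int partition(int[] arr, int lo, int hi) {
        int r = lo + (int) (Math.random() * (hi - lo + 1));
        swap(arr, r, hi);               // step 1: choose (random) pivot
        int pivot = arr[hi], i = lo;
        for (int j = lo; j < hi; j++)   // step 2: partition around the pivot
            if (arr[j] < pivot) swap(arr, i++, j);
        swap(arr, i, hi);               // place the pivot between the two parts
        return i;
    }

    private static void swap(int[] arr, int a, int b) {
        int t = arr[a]; arr[a] = arr[b]; arr[b] = t;
    }

    public static void main(String[] args) {
        int[] data = {7, 2, 8, 1, 3};
        sort(data);
        System.out.println(Arrays.toString(data)); // prints [1, 2, 3, 7, 8]
    }
}
```

Randomizing the pivot makes the O(n²) case unlikely for any fixed input; production libraries typically go further (e.g., median-of-three pivot selection or switching to heapsort on deep recursion).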