The worst-case time complexity of a typical QuickSort implementation is O(n²). The worst case occurs when the picked pivot is always an extreme (smallest or largest) element. This happens when the input array is already sorted or reverse sorted and either the first or the last element is picked as the pivot.
Although randomized QuickSort works well even when the array is sorted, there is still a possibility that the randomly picked element is always an extreme. Can the worst case be reduced to O(n log n)?
The answer is yes, we can achieve an O(n log n) worst case. The idea is based on the fact that the median of an unsorted array can be found in linear time. So we find the median first, then partition the array around the median element.
Following is a C++ implementation based on the above idea. Most of the functions in the program below are copied from K'th Smallest/Largest Element in Unsorted Array | Set 3 (Worst Case Linear Time).
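The original listing is not reproduced here; what follows is a minimal sketch of the same idea, assuming distinct elements. Helper names such as partitionAround, medianOfFive, and kthSmallest are illustrative choices, not necessarily those of the referenced post.

#include <algorithm>
#include <iostream>
#include <vector>

// Partition arr[l..r] around the value pivot (assumed to be present in the
// range); returns the pivot's final index.
int partitionAround(int arr[], int l, int r, int pivot)
{
    int i;
    for (i = l; i < r; i++)   // locate the pivot and move it to the end
        if (arr[i] == pivot)
            break;
    std::swap(arr[i], arr[r]);

    i = l;                    // standard Lomuto partition
    for (int j = l; j < r; j++)
        if (arr[j] <= pivot)
            std::swap(arr[i++], arr[j]);
    std::swap(arr[i], arr[r]);
    return i;
}

// Median of a small group (at most 5 elements), found by sorting it in place.
int medianOfFive(int arr[], int n)
{
    std::sort(arr, arr + n);
    return arr[n / 2];
}

// Returns the k'th smallest element of arr[l..r] (k is 1-indexed) in
// worst-case linear time via median-of-medians. Assumes distinct elements.
int kthSmallest(int arr[], int l, int r, int k)
{
    int n = r - l + 1;

    std::vector<int> medians;          // medians of groups of 5
    int i;
    for (i = 0; i < n / 5; i++)
        medians.push_back(medianOfFive(arr + l + i * 5, 5));
    if (i * 5 < n)                     // leftover group of fewer than 5
        medians.push_back(medianOfFive(arr + l + i * 5, n % 5));

    int m = medians.size();
    int medOfMed = (m == 1) ? medians[0]
                            : kthSmallest(medians.data(), 0, m - 1, (m + 1) / 2);

    int pos = partitionAround(arr, l, r, medOfMed);
    if (pos - l == k - 1)
        return arr[pos];
    if (pos - l > k - 1)
        return kthSmallest(arr, l, pos - 1, k);
    return kthSmallest(arr, pos + 1, r, k - pos + l - 1);
}

// QuickSort that always partitions around the exact median of the range.
void quickSort(int arr[], int l, int r)
{
    if (l < r) {
        int n = r - l + 1;
        int median = kthSmallest(arr, l, r, n / 2 + 1);
        int pos = partitionAround(arr, l, r, median);
        quickSort(arr, l, pos - 1);
        quickSort(arr, pos + 1, r);
    }
}

int main()
{
    int arr[] = {1000, 10, 7, 8, 9, 30, 900, 20, 5, 6, 1};
    int n = sizeof(arr) / sizeof(arr[0]);
    quickSort(arr, 0, n - 1);
    std::cout << "Sorted array is ";
    for (int i = 0; i < n; i++)
        std::cout << arr[i] << " ";
    std::cout << std::endl;
    return 0;
}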
Output:
Sorted array is 1 5 6 7 8 9 10 20 30 900 1000
How is QuickSort implemented in practice – is the above approach used?
Although the worst-case time complexity of the above approach is O(n log n), it is never used in practical implementations. The hidden constants in this approach are high compared to those of normal QuickSort. Following are some techniques used in practical implementations of QuickSort; a sketch combining them follows the list.
1) Randomly picking the pivot to make the worst case less likely to occur (Randomized QuickSort).
2) Calling insertion sort for small arrays to reduce the number of recursive calls.
3) QuickSort is tail recursive, so tail call optimization is applied to limit the recursion depth.
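A minimal sketch of how these three techniques fit together, assuming a Lomuto partition; the cutoff value of 16 and the names practicalQuickSort and randomizedPartition are illustrative choices, not from any specific library:

#include <cstdlib>
#include <utility>

const int CUTOFF = 16; // illustrative threshold for switching to insertion sort

void insertionSort(int arr[], int l, int r)
{
    for (int i = l + 1; i <= r; i++) {
        int key = arr[i], j = i - 1;
        while (j >= l && arr[j] > key) {
            arr[j + 1] = arr[j];
            j--;
        }
        arr[j + 1] = key;
    }
}

// Technique 1: Lomuto partition with a randomly chosen pivot.
int randomizedPartition(int arr[], int l, int r)
{
    int p = l + std::rand() % (r - l + 1);
    std::swap(arr[p], arr[r]);
    int pivot = arr[r], i = l;
    for (int j = l; j < r; j++)
        if (arr[j] <= pivot)
            std::swap(arr[i++], arr[j]);
    std::swap(arr[i], arr[r]);
    return i;
}

void practicalQuickSort(int arr[], int l, int r)
{
    while (l < r) {
        // Technique 2: hand small ranges to insertion sort.
        if (r - l + 1 <= CUTOFF) {
            insertionSort(arr, l, r);
            return;
        }
        int pos = randomizedPartition(arr, l, r);
        // Technique 3: manual tail call elimination - recurse on the
        // smaller side and loop on the larger.
        if (pos - l < r - pos) {
            practicalQuickSort(arr, l, pos - 1);
            l = pos + 1;
        } else {
            practicalQuickSort(arr, pos + 1, r);
            r = pos - 1;
        }
    }
}

Recursing on the smaller side and looping on the larger bounds the stack depth at O(log n) even when partitions are unbalanced.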
So the approach discussed above is more of a theoretical one, with an O(n log n) worst-case time complexity.
This article is compiled by Shivam.