QuickSort
Overview of QuickSort
QuickSort is the fastest known sorting algorithm in practice. It was developed by C.A.R. Hoare in 1960, when he was just 26 years old, and published in the Communications of the ACM in 1961; it revolutionized computer science, and Hoare was knighted in 1980.
Quicksort has an average running time of \(O(N \log N)\) but a worst-case performance of \(O\left(N^{2}\right)\). The reason that Quicksort is considered the fastest sorting algorithm when used carefully is that a good design can make the probability of hitting the worst case negligible.
In this chapter, we’ll review the algorithm and analyze its performance.
QuickSort Algorithm
Basic Idea
Quicksort is a divide-and-conquer sorting algorithm. The general idea is to pick an element from the array and to use that element to partition the array into two disjoint sets: one consisting of all elements no larger than the selected element, and the other consisting of elements no smaller than it. The first set will reside in the lower part of the array and the second in the upper part, with the selected element placed between the two. This process is repeated recursively on each of the two sets so created, until the set being partitioned has size zero or one.
Let S represent the set of elements to be sorted, i.e., S is an array. Let quicksort(S) be defined as the following sequence of steps:

If the number of elements in S is 0 or 1, then return, because it is already sorted.

Pick any element v in S. Call this element v the pivot.

Partition \(S \setminus \{v\}\) into two disjoint sets \(S_{1}=\{x \in S \setminus \{v\} \mid x \leq v\}\) and \(S_{2}=\{x \in S \setminus \{v\} \mid x \geq v\}\); elements equal to v may end up in either \(S_{1}\) or \(S_{2}\).

Return quicksort \(\left(S_{1}\right)\) followed by v followed by quicksort \(\left(S_{2}\right)\).
To illustrate, suppose an array contains the elements:
8 1 4 9 0 3 5 2 7 6
Suppose that we pick 6 as the pivot. Then after partitioning the array it will look like:
1 4 0 3 5 2 6 8 9 7
where S1 is the set {1,4,0,3,5,2} and S2 is the set {8,9,7}. The spaces are inserted only to identify the sets. If we recursively apply Quicksort to S1 and S2 (and assume, by an induction argument, that it does indeed sort them), then after the recursive calls the array will be in the order
0 1 2 3 4 5 6 7 8 9
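The four steps above can be sketched directly as code. The following is a minimal, copy-based version for illustration only (the function name `quicksort_sets` is ours, and the pivot here is simply the first element; the in-place algorithm developed later in this chapter is what one would actually use):

```cpp
#include <cstddef>
#include <vector>

// Minimal, copy-based sketch of the four quicksort steps above
// (illustration only; not the in-place version used later).
std::vector<int> quicksort_sets(const std::vector<int>& s) {
    if (s.size() <= 1) return s;               // step 1: size 0 or 1 is sorted
    int v = s[0];                              // step 2: pick a pivot
    std::vector<int> s1, s2;                   // step 3: partition S - {v}
    for (std::size_t k = 1; k < s.size(); ++k)
        (s[k] <= v ? s1 : s2).push_back(s[k]);
    std::vector<int> result = quicksort_sets(s1);  // step 4: sort S1...
    result.push_back(v);                           // ...then the pivot...
    std::vector<int> upper = quicksort_sets(s2);   // ...then sort S2
    result.insert(result.end(), upper.begin(), upper.end());
    return result;
}
```

Applied to the example array above, this returns the elements in sorted order, at the cost of copying that the in-place version avoids.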
Picking a Pivot
A poor choice of pivot will cause one set to be very small and the other very large, and the algorithm will achieve its \(O\left(n^{2}\right)\) worst-case behavior. This happens whenever the pivot is smaller than all other keys or larger than all other keys. Ideally the pivot would be the median element, but without the trouble of actually finding the median (which is too expensive). One could pick the pivot randomly, but then there is no guarantee about what it will be, and the cost of random number generation buys little savings in the end.
A good compromise between safety and performance is to pick three elements and take their median. Doing this makes quadratic behavior extremely unlikely in practice. A common choice is the first, last, and middle elements of the array.
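The median of three values is simply the one that is neither the smallest nor the largest, which can be computed with a few min/max operations. The helper below is a small illustration with a name of our choosing, not the chapter's `median3` function (which also rearranges the array and appears later):

```cpp
#include <algorithm>

// Illustrative helper: the median of three values is the one that is
// neither the smallest nor the largest of the three.
int median_of_three(int first, int middle, int last) {
    return std::max(std::min(first, middle),
                    std::min(std::max(first, middle), last));
}
```

For the example array above, the first, middle, and last elements are 8, 0, and 6, so this rule picks 6 as the pivot.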
Partitioning
The best way to partition is to push the pivot element out of the way by placing it at the beginning or end of the array. Having done that, all of the implementations do basically the same thing, except for how they handle elements equal to the pivot.
The general idea is to advance two pointers, i and j, toward the middle, swapping misplaced elements, until i and j cross. The following example shows an array with the initial placement of the i and j pointers and the pivot, after the pivot has been moved to the last position in the array. In each step, j first travels down the array looking for an element that does not belong, then i travels up the array doing the same thing. When both have stopped, the elements are swapped and each pointer advances one position. In this example the element at the lowest index is not less than or equal to the pivot, as it would be if a smarter pivot-selection method such as median-of-three were used.
\[\begin{array}{llllllllll}8 & 1 & 4 & 9 & 0 & 3 & 5 & 2 & 7 & 6 \\ \mathbf{i} & & & & & & & & \mathbf{j} & \text { pivot } \\ 8 & 1 & 4 & 9 & 0 & 3 & 5 & 2 & 7 & 6 \\ \mathbf{i} & & & & & & & \mathbf{j} & & \\ 2 & 1 & 4 & 9 & 0 & 3 & 5 & 8 & 7 & 6 \\ & \mathbf{i} & & & & & \mathbf{j} & & & \\ 2 & 1 & 4 & 9 & 0 & 3 & 5 & 8 & 7 & 6 \\ & & & \mathbf{i} & & & \mathbf{j} & & & \\ 2 & 1 & 4 & 5 & 0 & 3 & 9 & 8 & 7 & 6 \\ & & & & \mathbf{i} & \mathbf{j} & & & & \\ 2 & 1 & 4 & 5 & 0 & 3 & 9 & 8 & 7 & 6 \\ & & & & & \mathbf{j} & \mathbf{i} & & & \end{array}\]
The scan stops at this point, because j has crossed i, and the pivot is swapped into place, so the final array is
\[\begin{array}{llllllllll} 2 & 1 & 4 & 5 & 0 & 3 & 6 & 8 & 7 & 9 \end{array}\]
The remaining issue is handling elements that are equal to the pivot. The safest approach is to have both i and j stop on elements equal to the pivot and swap them, just as if they belonged on the other side; otherwise, on arrays with many equal keys, the algorithm can degrade to its worst case.
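The scan just traced can be written as a standalone partition routine. This is a sketch with a name of our choosing; unlike the chapter's code, which relies on median-of-three sentinels, it guards j explicitly so it works with any last-element pivot:

```cpp
#include <utility>
#include <vector>

// Sketch of the scan-and-swap partition traced above: the pivot sits in
// a[right], i moves up past elements < pivot, j moves down past elements
// > pivot, out-of-place pairs are swapped, and finally the pivot is
// swapped into the crossing point. Returns the pivot's final index.
int partition_around_last(std::vector<int>& a, int left, int right) {
    int i = left - 1, j = right;
    int pivot = a[right];
    while (true) {
        while (a[++i] < pivot) {}                 // i stops at element >= pivot
        while (j > left && pivot < a[--j]) {}     // j stops at element <= pivot
        if (i < j)
            std::swap(a[i], a[j]);
        else
            break;
    }
    std::swap(a[i], a[right]);                    // pivot into its final slot
    return i;
}
```

Running this on the example array 8 1 4 9 0 3 5 2 7 6 with pivot 6 reproduces the trace above and leaves the pivot at index 6.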
A recursive version of quick sort that uses this strategy is in the listing below.
Note: Code in this section is written in C++. Source code in Java, Python and other languages can be found at the end of this chapter.
void quicksort(T A[], int left, int right) {
    // use quicksort only when the range holds more than ten elements
    if (left + 10 <= right) {
        // median3 picks three values, sorts them, puts the pivot into
        // A[right], the smallest of the three into A[left], and the
        // largest into A[right - 1]
        T pivot = median3(A, left, right);
        // A[left] <= pivot <= A[right - 1] and pivot is A[right]
        // now partition
        int i = left;
        int j = right - 1;
        bool done = false;
        while (!done) {
            while (A[++i] < pivot) {} // advance i
            while (pivot < A[--j]) {} // advance j
            // A[j] <= pivot <= A[i]
            if (i < j)
                swap(A[i], A[j]);
            else
                done = true;
        }
        // now j <= i and A[i] >= pivot and A[j] <= pivot,
        // so we can swap the pivot with A[i]
        swap(A[i], A[right]);
        // the pivot is now between the two sets, in A[i]
        // quicksort the left set:
        quicksort(A, left, i - 1);
        // quicksort the right set:
        quicksort(A, i + 1, right);
    } else {
        // the range is too small to quicksort efficiently
        insertion_sort(A, left, right);
    }
}
The function median3(...) is listed below, but first let’s review a few important points in the code above:

The first element that i examines is A[left + 1], and this is intentional. The median3 function rearranges the array so that \(A[\text{left}] \leq\) pivot, so it is not necessary to examine it.

A symmetric statement holds for j: median3 places the pivot in A[right] and leaves an element no smaller than the pivot in A[right - 1], so j can start at right - 1 and begin by comparing the pivot to A[right - 2].

Neither i nor j can exceed the array bounds, for the same reason: A[right - 1] acts as a sentinel for i, and A[left] as a sentinel for j.
/*
 * @post a[first] <= a[last] <= a[last - 1], and a[last] contains the
 * median of the original values in locations a[first],
 * a[(first + last) / 2], and a[last].
 */
// Note: C is int
C median3(vector<C>& a, int first, int last) {
    int middle = (first + last) / 2;
    C temp;
    if (a[first] < a[middle]) {
        if (a[middle] < a[last]) {
            // first < middle < last
            temp = a[last - 1];
            a[last - 1] = a[last];
            a[last] = a[middle]; // middle is pivot
            a[middle] = temp;
        } else if (a[last] < a[first]) {
            // last < first < middle
            temp = a[last - 1];
            a[last - 1] = a[middle];
            a[middle] = temp;
            temp = a[last];
            a[last] = a[first]; // first is pivot
            a[first] = temp;
        } else {
            // first <= last <= middle
            temp = a[last - 1];
            a[last - 1] = a[middle];
            a[middle] = temp;
            // last is pivot
        }
    } else if (a[middle] < a[last]) {
        if (a[last] < a[first]) {
            // middle < last < first
            temp = a[last - 1];
            a[last - 1] = a[first];
            a[first] = a[middle];
            a[middle] = temp; // last is pivot
        } else {
            // middle <= first <= last
            temp = a[last - 1];
            a[last - 1] = a[last];
            a[last] = a[first]; // first is pivot
            a[first] = a[middle];
            a[middle] = temp;
        }
    } else {
        // last <= middle <= first
        temp = a[last - 1];
        a[last - 1] = a[first];
        a[first] = a[last];
        a[last] = a[middle]; // middle is pivot
        a[middle] = temp;
    }
    return a[last];
}
Analysis of Quicksort (Time and Space Complexity)
Here’s a quick summary of time and space complexity of the Quicksort algorithm, with details below.
 Worst-case performance: \(O(N^2)\)
 Best-case performance: \(O(N \log N)\) (simple partition) or \(O(N)\) (three-way partition with many equal keys)
 Average performance: \(O(N \log N)\)
 Space complexity: the in-place version of Quicksort has a space complexity of \(O(\log N)\).
The partition loop of Quicksort iterates until the indices i and j pass each other. They start at opposite ends of the array and advance toward each other with each key comparison, so roughly n key comparisons take place for an array of size n. The function is then called recursively twice, once for each set. Suppose the pivot is chosen very badly each time, so that it is always the largest element in the set. Then the upper set is empty and the lower set has roughly n − 1 elements. The recursive call on the lower set makes n − 1 key comparisons, and the call on the upper set does not execute at all. The recursion therefore descends through lower sets that shrink by one each time, from n to n − 1 to n − 2 to n − 3 and so on, until the last call has an array of size 3, where it stops. The total number of key comparisons is therefore roughly:
\[3 + 4 + 5 + \cdots + (n-2) + (n-1) + n \approx \frac{n(n-1)}{2} = O\left(N^{2}\right)\]
Thus, the worst case of Quicksort is \(O\left(N^{2}\right)\), which is why it is important to choose the pivot properly. A poor choice of pivot will make one set very small and the other very large, producing the \(O\left(N^{2}\right)\) behavior; this happens whenever the pivot is smaller than all other keys or larger than all other keys. Quicksort does best when the pivots divide the array roughly in half every time. In that case the depth of the recursion is \(\log_2 n\), and it can be proved that each level of the recursion uses \(O(N)\) key comparisons, so at best Quicksort takes \(O(N \log N)\) steps. It can be proved that the average case is \(O(N \log N)\) as well; in fact, the average case requires about 1.39 times as many comparisons as the best case.
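The two cases above can be summarized as recurrences, assuming that partitioning an n-element array costs about n comparisons:

\[
T(n) = T(n-1) + n \;\Longrightarrow\; T(n) \approx \frac{n(n-1)}{2} = O\left(n^{2}\right) \qquad \text{(worst case: one empty set),}
\]
\[
T(n) = 2\,T(n/2) + n \;\Longrightarrow\; T(n) = O(n \log n) \qquad \text{(best case: two equal halves).}
\]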
A Non-recursive Quicksort Algorithm
Although function-call overhead is not what it used to be, recursion in Quicksort still reduces performance because of the time spent creating and maintaining stack frames. A non-recursive quicksort can rely on a user-defined stack and avoid the runtime library’s overhead. The version below uses generic push and pop operations; the function median3 is the one defined earlier.
typedef int C; // Note: C is int

void swap(C& x, C& y) {
    C temp = x;
    x = y;
    y = temp;
}

struct Pair {
    Pair(int x, int y): l(x), r(y) {}
    int l, r;
};

void quicksort(vector<C>& a, int last) {
    stack<Pair> s;
    int left, right;
    int i, j;
    Pair temp(0, last);
    s.push(temp);
    while (!s.empty()) {
        temp = s.top();
        s.pop();
        left = temp.l;
        right = temp.r;
        while (left < right) { /* or left < (right - MINSIZE) */
            C pivot = median3(a, left, right);
            // partition step
            i = left;
            j = right - 1;
            while (true) {
                while (a[++i] < pivot) {}
                while (pivot < a[--j]) {}
                if (i < j)
                    swap(a[i], a[j]);
                else
                    break;
            }
            // move pivot into position in the middle
            swap(a[i], a[right]);
            // push the larger of the two regions on the stack, and do the
            // smaller within the loop; this guarantees that the stack will
            // never hold more than log_2(n) pairs of parameters
            if ((right - i) > (i - left)) {
                if (right > i)
                    s.push(Pair(i + 1, right));
                right = i - 1;
            } else {
                if (i > left)
                    s.push(Pair(left, i - 1));
                left = i + 1;
            }
        }
        // could do the insertion sort here if right - left < MINSIZE
    }
}
This version is designed so that the larger of the two partitions is always put on the stack and the smaller is processed immediately. This prevents the stack from ever holding more than \(\log_2 N\) frames; if we did not do this, the worst-case stack would hold O(N) frames.
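The push-larger, process-smaller rule can be isolated in a compact sketch. The function name `quicksort_iter` is ours, and it uses a simple last-element (Lomuto-style) partition rather than the median-of-three scheme above, purely so the example stays self-contained:

```cpp
#include <stack>
#include <utility>
#include <vector>

// Compact sketch of the push-larger / process-smaller rule with an
// explicit stack of (left, right) index pairs. Uses a simple
// last-element partition, not the chapter's median-of-three scheme.
void quicksort_iter(std::vector<int>& a) {
    std::stack<std::pair<int, int>> s;
    if (!a.empty()) s.push(std::make_pair(0, (int)a.size() - 1));
    while (!s.empty()) {
        int left = s.top().first, right = s.top().second;
        s.pop();
        while (left < right) {
            int pivot = a[right], i = left;
            for (int k = left; k < right; ++k)   // last-element partition
                if (a[k] <= pivot) std::swap(a[i++], a[k]);
            std::swap(a[i], a[right]);           // pivot now at index i
            // push the larger region, keep working on the smaller one,
            // so the stack never holds more than about log2(n) entries
            if (right - i > i - left) {
                s.push(std::make_pair(i + 1, right));
                right = i - 1;
            } else {
                s.push(std::make_pair(left, i - 1));
                left = i + 1;
            }
        }
    }
}
```

Because the region still being processed inside the inner loop is always at most half the size of the one just pushed, the stack depth is bounded logarithmically even on adversarial inputs.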
Common Quicksort Questions
1. Is Quicksort an in-place sorting algorithm?
Broadly speaking, yes: it uses extra space for the recursion (or explicit stack), but it does not need to make full or partial copies of the input array.
2. Is Quicksort adaptive?
No. An adaptive algorithm’s running time improves when the input array is already (or partially) sorted. Quicksort is not adaptive, and neither is heap sort, which uses O(N log N) comparisons even when the input is already sorted.
3. Is Quicksort stable?
No. A stable sort keeps elements with equal keys in their original relative order; Quicksort does not guarantee this.
4. Is Quicksort divide and conquer?
Yes. A divide-and-conquer algorithm repeatedly breaks a problem into subproblems until they become simple enough to solve directly, and Quicksort follows exactly this pattern.
Code Examples
QuickSort in Java
import java.util.*;
class Quick_Sort {
public static int[] QuickSort(int[] arr, int elements) {
if(elements < 2){ //Base Case
return arr;
}
int current_position=0; //position of pivot element
int temp; //a temporary variable to assist in swapping
for(int i=1; i<elements; i++) //Partitioning loop
{
if(arr[i] <= arr[0])
{
current_position++;
temp = arr[i];
arr[i] = arr[current_position];
arr[current_position] = temp;
}
}
temp = arr[0];
arr[0] = arr[current_position];
arr[current_position] = temp; //Brings pivot to its appropriate position
int[] left = QuickSort(arr,current_position); //sorts the elements to the left of pivot
int[] arr2 = Arrays.copyOfRange(arr, current_position+1, elements);//separates elements right of pivot
int[] right = QuickSort(arr2, elements - current_position - 1); //sorts the elements to the right of pivot
int[] final_array = new int[elements]; //final array, to merge everything together
for(int i=0; i<current_position; i++)
{
final_array[i] = left[i];
}
final_array[current_position] = arr[current_position];
for(int i=current_position+1; i<elements; i++)
{
final_array[i] = right[i - current_position - 1];
}
return final_array;
}
public static void main( String args[] ) {
int[] array = {4,2,7,3,1,6}; //array to be sorted
System.out.print("Original Array: [");
for(int i=0; i<array.length;i++)
{
System.out.print(array[i]);
System.out.print(" ");
}
System.out.println("]");
array = QuickSort(array, array.length);//sorting
System.out.print("Sorted Array: ["); //printing the sorted array
for(int i=0; i<array.length;i++)
{
System.out.print(array[i]);
System.out.print(" ");
}
System.out.print("]");
}
}
QuickSort in Python
def QuickSort(arr):
elements = len(arr)
#Base case
if elements < 2:
return arr
current_position = 0 #Position of the partitioning element
for i in range(1, elements): #Partitioning loop
if arr[i] <= arr[0]:
current_position += 1
temp = arr[i]
arr[i] = arr[current_position]
arr[current_position] = temp
temp = arr[0]
arr[0] = arr[current_position]
arr[current_position] = temp #Brings pivot to its appropriate position
left = QuickSort(arr[0:current_position]) #Sorts the elements to the left of pivot
right = QuickSort(arr[current_position+1:elements]) #sorts the elements to the right of pivot
arr = left + [arr[current_position]] + right #Merging everything together
return arr
array_to_be_sorted = [4,2,7,3,1,6]
print("Original Array: ",array_to_be_sorted)
print("Sorted Array: ",QuickSort(array_to_be_sorted))
Summary
Quicksort is, on average, the fastest known sorting algorithm when implemented correctly. A careful implementation avoids recursion by using an explicit stack: instead of making the two recursive calls, the parameters of one call are pushed onto the stack while the other call is executed within the loop, and the stack is later popped to retrieve the parameters and simulate the call. To keep the stack from growing too large, the call for the larger partition is the one pushed onto the stack, and the one for the smaller partition is executed immediately.