For additional reading and resources, you can refer to https://jeffe.cs.illinois.edu/teaching/algorithms/book/00-intro.pdf and https://jeffe.cs.illinois.edu/teaching/algorithms/book/01-recursion.pdf
Earlier today, we learned about selection sort. Quicksort is another sorting algorithm we will learn.
The high-level algorithm for quicksort consists of three steps:
1. Choose a pivot element from the array.
2. Partition the array so that elements less than or equal to the pivot come before it, and larger elements come after it.
3. Recursively quicksort the subarrays to the left and right of the pivot.
Below is a visual of what this looks like:
A key component of quicksort is the partition function. Partition is responsible for putting elements on the correct side of the pivot. We will start by writing the partition function.
First, let's think about how to select a pivot element from the array. What are some ways we could select a pivot? Discuss with your neighbor.
# How can we select a pivot?
# last or first element, random, always pick middle
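To make these options concrete, here is a small sketch of the common pivot-selection strategies. The helper names are our own, not part of the lecture code; each returns an index into the array.

```python
import random

# A few common pivot-selection strategies (helper names are illustrative).
def pivot_first(arr, left, right):
    return left                         # always the first element

def pivot_last(arr, left, right):
    return right                        # always the last element

def pivot_middle(arr, left, right):
    return (left + right) // 2          # always the middle element

def pivot_random(arr, left, right):
    return random.randint(left, right)  # a random index in [left, right]

arr = [7, 1, 9, 20, 4, 6]
print(pivot_first(arr, 0, 5))   # 0
print(pivot_last(arr, 0, 5))    # 5
print(pivot_middle(arr, 0, 5))  # 2
```

The partition function we write below uses the last element, which is simple but (as we will see) not always the best choice.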
Once we have selected a pivot, we need to actually partition the elements around it. Think about what steps we would need to take:
# How can we figure out whether an element goes to the right or left of the pivot?
# How can we move elements from one side of the pivot to the other?
# Partition
def partition(arr, left, right):
    pivot = arr[right]       # use the last element as the pivot
    pivot_position = left    # where the pivot will eventually go
    for elem in range(left, right):  # elem is an index into arr
        if arr[elem] <= pivot:
            arr[pivot_position], arr[elem] = arr[elem], arr[pivot_position]
            pivot_position = pivot_position + 1
    # put the pivot into its final position
    arr[pivot_position], arr[right] = arr[right], arr[pivot_position]
    return pivot_position
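Before using partition inside quicksort, we can sanity-check it on a small list. Here is the same function with a quick trial run; the expected output below is worked out by hand.

```python
def partition(arr, left, right):
    pivot = arr[right]
    pivot_position = left
    for elem in range(left, right):
        if arr[elem] <= pivot:
            arr[pivot_position], arr[elem] = arr[elem], arr[pivot_position]
            pivot_position = pivot_position + 1
    arr[pivot_position], arr[right] = arr[right], arr[pivot_position]
    return pivot_position

l = [7, 1, 9, 20, 4, 6]
p = partition(l, 0, len(l) - 1)
print(p)  # 2 -- the pivot 6 ended up at index 2
print(l)  # [1, 4, 6, 20, 7, 9] -- smaller elements left of 6, larger right
```

Note that partition does not fully sort either side; it only guarantees each element is on the correct side of the pivot.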
# Quicksort
def quicksort(arr, low, high):
    if low < high:
        pivot = partition(arr, low, high)
        quicksort(arr, low, pivot - 1)   # left
        quicksort(arr, pivot + 1, high)  # right
l_unsorted = [7, 1, 9, 20, 4, 6]
print(l_unsorted)
quicksort(l_unsorted, 0, len(l_unsorted) - 1)
print(l_unsorted)
[7, 1, 9, 20, 4, 6]
[1, 4, 6, 7, 9, 20]
Quicksort's time complexity depends on the choice of pivot. Picking a good pivot can make a big difference in how fast the algorithm runs.
For example, what if we were able to pick the MEDIAN element of the array as the pivot each time? Then every partition leaves two subarrays of equal size on the left and right of the pivot. This gives a best-case running time of $O(n \log n)$: the recursion depth is $\log(n)$ rather than $n$, and each level of recursion does $O(n)$ total work in partition.
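Computing the exact median every time is expensive, but a common approximation is "median of three": use the median of the first, middle, and last elements. This helper is our own sketch, not part of the lecture code.

```python
def median_of_three(arr, low, high):
    """Return the index of the median of arr[low], arr[mid], arr[high]."""
    mid = (low + high) // 2
    # Pair each candidate value with its index, then sort by value.
    candidates = [(arr[low], low), (arr[mid], mid), (arr[high], high)]
    candidates.sort()
    return candidates[1][1]  # index of the middle value

arr = [7, 1, 9, 20, 4, 6]
# arr[0] = 7, arr[2] = 9, arr[5] = 6 -> the median value is 7, at index 0
print(median_of_three(arr, 0, 5))  # 0
```

A quicksort using this strategy would swap the chosen element to the end of the range and then partition as before.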
Now, let's think through the worst case time complexity of quicksort. Let's assume we are picking the rightmost value in our array as our pivot. What are some cases where we would have the worst case time complexity? Discuss with your neighbor.
# When do we have worst case quicksort?
# already sorted (ascending or descending), or all elements equal
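We can check the sorted-input case empirically with an instrumented version of partition that counts comparisons (the counter is our own addition, not part of the lecture code). On an already-sorted array of length $n$, a last-element pivot splits off only one element per call, so we do $(n-1) + (n-2) + \dots + 1 = n(n-1)/2$ comparisons.

```python
comparisons = 0

def partition(arr, left, right):
    global comparisons
    pivot = arr[right]
    pivot_position = left
    for elem in range(left, right):
        comparisons += 1  # count each comparison against the pivot
        if arr[elem] <= pivot:
            arr[pivot_position], arr[elem] = arr[elem], arr[pivot_position]
            pivot_position = pivot_position + 1
    arr[pivot_position], arr[right] = arr[right], arr[pivot_position]
    return pivot_position

def quicksort(arr, low, high):
    if low < high:
        pivot = partition(arr, low, high)
        quicksort(arr, low, pivot - 1)
        quicksort(arr, pivot + 1, high)

arr = list(range(8))  # already sorted: worst case for a last-element pivot
quicksort(arr, 0, len(arr) - 1)
print(comparisons)    # 28, which is 8 * 7 / 2
```

This quadratic growth in comparisons is exactly the $O(n^2)$ worst case we analyze next.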
What is quicksort's time complexity? Let's take a look at the code we wrote and see if we can figure it out.
# Quicksort function code
# Partition
def partition(arr, left, right):
    pivot = arr[right]   # O(1)
    current = left       # O(1)
    for elem in range(left, right):  # O(n)
        if arr[elem] <= pivot:       # O(1)
            arr[current], arr[elem] = arr[elem], arr[current]  # O(1)
            current = current + 1    # O(1)
    arr[current], arr[right] = arr[right], arr[current]  # O(1)
    return current       # O(1)
def quicksort(arr, low, high):  # O(n^2) worst case: up to n levels of recursion
    if low < high:
        pivot = partition(arr, low, high)  # O(n)
        quicksort(arr, low, pivot - 1)     # left
        quicksort(arr, pivot + 1, high)    # right
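Putting this together as recurrences: in the worst case the pivot splits off only one element, so $T(n) = T(n-1) + O(n) = O(n^2)$. In the best case the pivot splits the array evenly, so $T(n) = 2T(n/2) + O(n) = O(n \log n)$.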