quicksort
O(n * log(n))
quicksort's average case time complexity
random (the pivot tends to land near the middle, splitting the array roughly in half)
quicksort's best case scenario
O(log(n))
space complexity for quicksort's best case
O(n)
time complexity for merge
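A minimal sketch of that merge step (the helper name `merge` is my own, not from the cards): two indices walk the already-sorted inputs once, so the work is linear in the total number of items.

```python
def merge(left, right):
    # Merge 2 already-sorted lists into one sorted list.
    # Each element is visited once, so this is O(n) time (plus O(n) space
    # for the output list).
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:          # <= keeps equal items in order (stable)
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])              # at most one side has leftovers
    merged.extend(right[j:])
    return merged

print(merge([1, 4, 7], [2, 3, 9]))       # [1, 2, 3, 4, 7, 9]
```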
O(n * log(n))
time complexity for merge SORT
O(n * log(n))
time complexity for quicksort's average case
O(n * log(n))
time complexity for quicksort's best case
O(n^2)
time complexity for quicksort's worst case
stable sorts
2 items with the same value cannot switch positions
unstable sorts
2 items with the same value may switch positions
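A quick way to see stability in action, using Python's built-in sort (which is stable); the example records are made up for illustration.

```python
# Records that tie on the sort key (the number) but differ in a second field.
records = [(3, "a"), (1, "b"), (3, "c"), (1, "d")]

# Python's built-in sorted() is stable: ties keep their original order, so
# (1, "b") stays ahead of (1, "d") and (3, "a") stays ahead of (3, "c").
print(sorted(records, key=lambda r: r[0]))
# [(1, 'b'), (1, 'd'), (3, 'a'), (3, 'c')]
```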
1. insertion sort 2. selection sort 3. merge sort 4. heapsort 5. quicksort 6. bubble sort 7. Timsort
7 sorting algorithms
random value, or the median of several values
how to choose a pivot in quicksort
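Two small sketches of those pivot strategies (the function names and exact rules are illustrative assumptions, not part of the card):

```python
import random

def random_pivot(arr, lo, hi):
    # A pivot value picked at a random index in arr[lo..hi]; makes the
    # O(n^2) worst case extremely unlikely on any input.
    return arr[random.randint(lo, hi)]

def median_of_three(arr, lo, hi):
    # The median of the first, middle, and last values; cheap insurance
    # against already-sorted input producing a terrible pivot every time.
    mid = (lo + hi) // 2
    return sorted([arr[lo], arr[mid], arr[hi]])[1]
```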
O(n^2)
insertion sort's average case time complexity
already sorted
insertion sort's best case scenario
O(n)
insertion sort's best case time complexity
O(1)
insertion sort's space complexity
sorted backwards
insertion sort's worst case scenario
O(n^2)
insertion sort's worst case time complexity
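A minimal insertion sort sketch tying these cards together: already-sorted input never enters the inner shift loop (the O(n) best case), reversed input shifts every element (the O(n^2) worst case), and it sorts in place (O(1) space).

```python
def insertion_sort(arr):
    # Sort arr in place (O(1) extra space) by growing a sorted prefix:
    # each new value is shifted left past larger neighbours until it fits.
    for i in range(1, len(arr)):
        value = arr[i]
        j = i - 1
        while j >= 0 and arr[j] > value:   # already-sorted input skips this
            arr[j + 1] = arr[j]            # loop entirely: the O(n) best case
            j -= 1                         # reversed input shifts everything:
        arr[j + 1] = value                 # the O(n^2) worst case
    return arr

print(insertion_sort([5, 2, 4, 1, 3]))     # [1, 2, 3, 4, 5]
```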
O(n)
merge sort's space complexity
O(n * log(n))
merge sort's time complexity
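A minimal recursive merge sort sketch (using the same kind of merge as above): roughly log(n) levels of halving with O(n) merge work per level, and the merged copies are where the O(n) extra space goes.

```python
def merge_sort(arr):
    # Split in half, sort each half recursively, then merge the two sorted
    # halves: about log(n) levels of splitting with O(n) merge work each.
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left, right = merge_sort(arr[:mid]), merge_sort(arr[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]   # these copies are the O(n) extra space

print(merge_sort([5, 2, 4, 1, 3]))         # [1, 2, 3, 4, 5]
```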
quicksort can be done in-place, needing only O(log(n)) stack space instead of merge sort's O(n) auxiliary array
one advantage quicksort has over merge sort
quicksort
- a sorting algorithm that, in each step, picks a value called the pivot and divides the array into 2 parts: values smaller than the pivot and values larger than it - this continues until arrays of size 1 are reached, at which point the entire array is sorted - to partition, we walk the array and grow the smaller part: whenever we see a value smaller than the pivot, we extend that region and move the value into it - when the walk is done, we know exactly where the pivot belongs: between the smaller and larger parts - in the worst case (a bad pivot every time) the problem only gets 1 element smaller in each step - good implementations try to avoid picking bad pivot values
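A minimal in-place sketch of that description (a Lomuto-style partition with the last element as pivot; both choices are illustrative assumptions): the partition grows the smaller-than-pivot region, then drops the pivot in between it and the larger values. This is also the in-place behaviour the advantage-over-merge-sort card refers to.

```python
def quicksort(arr, lo=0, hi=None):
    # In-place quicksort: partition puts the pivot between the smaller and
    # larger parts, then each part is sorted recursively.
    if hi is None:
        hi = len(arr) - 1
    if lo < hi:
        p = partition(arr, lo, hi)
        quicksort(arr, lo, p - 1)          # values smaller than the pivot
        quicksort(arr, p + 1, hi)          # values larger than the pivot
    return arr

def partition(arr, lo, hi):
    pivot = arr[hi]                        # simplest possible pivot choice
    boundary = lo                          # end of the "smaller than pivot" part
    for i in range(lo, hi):
        if arr[i] < pivot:                 # smaller value: grow the smaller part
            arr[i], arr[boundary] = arr[boundary], arr[i]
            boundary += 1
    arr[hi], arr[boundary] = arr[boundary], arr[hi]   # pivot goes in between
    return boundary

print(quicksort([5, 2, 4, 1, 3]))          # [1, 2, 3, 4, 5]
```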
Timsort
- the adaptive sorting algorithm used by Python and now Java - far more complex than any of the algorithms we've discussed, but it takes advantage of runs of already-sorted values in the data - internally it uses insertion sort to sort small runs and merge-sort-style merging to combine them
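You rarely implement Timsort yourself; in Python it is simply the built-in sort. A small usage sketch (the data is made up to give it a long pre-sorted run):

```python
import random

# Mostly sorted data with a shuffled tail: Timsort finds the long ascending
# run at the front and only has real work to do on the last 100 items.
data = list(range(900)) + random.sample(range(900, 1_000), 100)

data.sort()                          # list.sort(): in-place, stable, adaptive
print(data == list(range(1_000)))    # True
```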
adaptive sorting algorithm
a type of sorting algorithm that adjusts its behavior to features of the data
nope - it always splits in half and always merges everything, so it's O(n * log(n)) no matter how the input is ordered
does merge sort have a "best case" or a "worst case"?
divide and conquer
an algorithm design paradigm based on multi-branched recursion. It works by recursively breaking a problem down into 2 or more sub-problems of the same or a related type until these become simple enough to be solved directly; the solutions to the sub-problems are then combined to give a solution to the original problem.
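A toy illustration of the paradigm (the problem and function name are my own example, not from the card); merge sort and quicksort follow the same split / solve / combine shape.

```python
def dc_max(values):
    # Toy divide and conquer: split the list, solve each half recursively,
    # then combine the two answers.
    if len(values) == 1:                   # simple enough to solve directly
        return values[0]
    mid = len(values) // 2
    return max(dc_max(values[:mid]),       # solve the sub-problems...
               dc_max(values[mid:]))       # ...then combine their solutions

print(dc_max([7, 3, 9, 1, 4]))             # 9
```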
O(n * log(n))
quicksort's best case time complexity
O(log(n))
quicksort's space complexity
already sorted or reverse sorted (which one depends on how the pivot is chosen)
quicksort's worst case scenario
O(n^2)
quicksort's worst case time complexity
O(n)
space complexity for merge SORT
O(log(n))
space complexity for quicksort's average case
O(n)
space complexity for quicksort's worst case
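The space in these cards is essentially quicksort's recursion stack. A rough sketch that only measures recursion depth (hypothetical helper, deliberately using the first element as pivot so the worst case is easy to trigger):

```python
import random
import sys

def quicksort_depth(values, depth=1):
    # Not a real sort: it only reports how many levels deep the recursion
    # goes, because that stack depth is the extra space quicksort needs.
    # First-element pivot, so already-sorted input hits the worst case.
    if len(values) <= 1:
        return depth
    pivot, rest = values[0], values[1:]
    smaller = [v for v in rest if v < pivot]
    larger = [v for v in rest if v >= pivot]
    return max(quicksort_depth(smaller, depth + 1),
               quicksort_depth(larger, depth + 1))

sys.setrecursionlimit(5_000)
n = 1_200
print(quicksort_depth(random.sample(range(n), n)))  # a few dozen levels: ~log(n)
print(quicksort_depth(list(range(n))))              # n levels: the O(n) worst case
```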
quicksort, because its average-case performance is as good as merge sort's but it can be done using much less memory (O(log(n)) instead of O(n))
what sorting algorithm do you use if you are short on space?
insertion sort, because it avoids the recursive calls made by merge sort and quicksort and is fastest on small arrays
what sorting algorithm do you use if you have a very small array?
merge sort, because its performance doesn't vary based on its inputs, although it requires O(n) space
what sorting algorithm do you use if you want predictable performance?