Algorithms Midterm
How many key comparisons does selection sort make?
1 + 2 + ... + (n-1) = n(n-1)/2
What is Quicksort?
A recursive divide-and-conquer algorithm that partitions the array around a pivot and recursively sorts each side; it is typically the fastest comparison sort in practice.
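A minimal Python sketch of quicksort using Lomuto partitioning around the last element (one common variant; the names are illustrative):

```python
def partition(a, lo, hi):
    # Lomuto partition: everything <= a[hi] ends up left of the pivot's final slot.
    pivot = a[hi]
    i = lo
    for j in range(lo, hi):
        if a[j] <= pivot:
            a[i], a[j] = a[j], a[i]
            i += 1
    a[i], a[hi] = a[hi], a[i]
    return i

def quicksort(a, lo=0, hi=None):
    # Sort a[lo..hi] in place: partition, then recurse on each side of the pivot.
    if hi is None:
        hi = len(a) - 1
    if lo < hi:
        p = partition(a, lo, hi)
        quicksort(a, lo, p - 1)
        quicksort(a, p + 1, hi)
```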
What is the average case of sequential search?
O(n)
What is the worst case of sequential search?
O(n)
What is the best case of Insertion sort?
O(n) (when the input is already sorted)
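A minimal insertion sort sketch; on already-sorted input the inner while loop never moves anything, which is where the O(n) best case comes from:

```python
def insertion_sort(a):
    # Grow a sorted prefix one element at a time, shifting larger items right.
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
```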
Counting sort is not in place. True or False?
True
Heap sort is an in-place sorting algorithm. True or False?
True
Heap sort is not a stable algorithm. True or False?
True
Merge sort is stable. True or false?
True
Radix sort is stable. True or False?
True
What is radix sort?
Sorts by applying a stable sort (counting sort) to the individual digits, starting with the least significant digit.
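A sketch of LSD radix sort for non-negative integers; buckets stand in here for the stable counting-sort pass to keep the sketch short (a simplification, not the only way to do the per-digit pass):

```python
def radix_sort(a, base=10):
    # Least-significant-digit radix sort: one stable pass per digit.
    if not a:
        return a
    digit = 1
    while digit <= max(a):
        buckets = [[] for _ in range(base)]
        for x in a:                      # stable: items keep their relative order
            buckets[(x // digit) % base].append(x)
        a = [x for b in buckets for x in b]
        digit *= base
    return a
```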
What is introsort?
Runs quicksort, but switches to heapsort once the recursion depth reaches 2logn; the fallback caps the worst case at O(nlogn).
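A rough introsort sketch, reusing the partition helper from the quicksort sketch above; sorted() stands in for the heapsort fallback to keep it short (both substitutions are simplifications, not the real library implementation):

```python
import math

def introsort(a, lo=0, hi=None, depth=None):
    # Quicksort with a depth budget of about 2*log2(n); when the budget runs out,
    # the remaining range is handed to a worst-case O(n log n) sort instead.
    if hi is None:
        hi = len(a) - 1
        depth = 2 * max(1, int(math.log2(max(len(a), 2))))
    if lo >= hi:
        return
    if depth == 0:
        a[lo:hi + 1] = sorted(a[lo:hi + 1])   # stand-in for heapsort on this range
        return
    p = partition(a, lo, hi)                  # Lomuto partition from the quicksort sketch
    introsort(a, lo, p - 1, depth - 1)
    introsort(a, p + 1, hi, depth - 1)
```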
Quicksort is faster in practice than _______sort, which is faster than ______sort.
heap, merge
Uses for binary search
The integer square root problem and finding the greatest common divisor
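As an example of the first use, the integer square root can be found by binary searching for the largest x with x*x <= n (a standard approach; the function name is just illustrative):

```python
def isqrt(n):
    # Binary search over 0..n for the largest x such that x*x <= n.
    lo, hi = 0, n
    while lo < hi:
        mid = (lo + hi + 1) // 2   # round up so the loop always makes progress
        if mid * mid <= n:
            lo = mid
        else:
            hi = mid - 1
    return lo
```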
a in the T(n) equation
number of recursive calls (subproblems) made at each level
What is merge sort?
Divides the array into two halves, recursively sorts each half, and then merges the two sorted halves back together.
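A minimal top-down merge sort sketch; the <= in the merge step is what keeps it stable:

```python
def merge_sort(a):
    # Split in half, sort each half recursively, then merge the two sorted halves.
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:     # take from the left on ties -> stable
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]
```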
What is the time complexity of Trial Division?
Exponential time
Counting sort is not stable. True or False?
False
Insertion sort does not sort in place. True or false?
False
What is Euclid's GCD algorithm?
Finds the greatest common divisor of two numbers by recursively computing gcd(b, a mod b) until b is zero
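The whole algorithm fits in a couple of lines of Python:

```python
def gcd(a, b):
    # Euclid's algorithm: gcd(a, b) = gcd(b, a mod b), stopping when b reaches 0.
    return a if b == 0 else gcd(b, a % b)
```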
Worst case of Euclid's GCD algorithm?
It performs O(log(a)) iterations (the worst case is consecutive Fibonacci numbers), which is O(N) when N is the bit length of the inputs
What is Trial Division?
It is a simple way to find the integer factorization of a number by checking ( n mod i ) for all integers i up to the square root of n. (It is faster to skip values of i that are multiples of 2, 3, or 5 once those primes have been checked.)
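A basic trial-division sketch (without the skip-multiples-of-2,-3,-5 optimization mentioned above):

```python
def factorize(n):
    # Divide out each candidate i while i*i <= n; whatever remains above 1 is prime.
    factors = []
    i = 2
    while i * i <= n:
        while n % i == 0:
            factors.append(i)
            n //= i
        i += 1
    if n > 1:
        factors.append(n)
    return factors
```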
What is interpolation search?
It uses knowledge about the distribution of the values in the array to estimate where the key should be, instead of always probing the middle.
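A sketch of interpolation search on a sorted array of numbers: it probes where the key would sit if the values were spread uniformly (names are illustrative):

```python
def interpolation_search(a, key):
    # Estimate the key's position by linear interpolation between a[lo] and a[hi].
    lo, hi = 0, len(a) - 1
    while lo <= hi and a[lo] <= key <= a[hi]:
        if a[hi] == a[lo]:
            pos = lo                     # avoid division by zero on equal endpoints
        else:
            pos = lo + (key - a[lo]) * (hi - lo) // (a[hi] - a[lo])
        if a[pos] == key:
            return pos
        if a[pos] < key:
            lo = pos + 1
        else:
            hi = pos - 1
    return -1
```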
What is counting sort?
Gives each input element a rank (based on the number of elements less than it) and places it in the output according to that rank.
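A minimal counting sort sketch for keys in 0..k; scanning the input from the right when placing keeps it stable:

```python
def counting_sort(a, k):
    # Count each key, turn counts into prefix sums (final positions),
    # then place elements by rank.
    count = [0] * (k + 1)
    for x in a:
        count[x] += 1
    for i in range(1, k + 1):
        count[i] += count[i - 1]
    out = [0] * len(a)
    for x in reversed(a):               # right-to-left preserves the input order of ties
        count[x] -= 1
        out[count[x]] = x
    return out
```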
What is an asymptotically optimal comparison sort?
Merge sort ( O(nlogn) )
best case of binary search
O ( 1 )
What is the worst case of interpolation search?
O ( n )
What is the average case of interpolation search?
O ( log (logn) ) with a uniform distribution of keys
Max-heapify running time is ?
O ( logn )
Build max heap running time is ?
O ( n )
average case of binary search
O (logn)
worst case of binary search
O (logn)
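The standard iterative binary search that the best, average, and worst cases above refer to (a generic version, not tied to a particular assignment):

```python
def binary_search(a, key):
    # Best case O(1): the first midpoint hits the key.
    # Average/worst case O(log n): the search range halves every iteration.
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if a[mid] == key:
            return mid
        if a[mid] < key:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1
```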
What is the runtime of Counting sort?
O (n + k), where k is the range of the keys; this is O (n) when k = O(n)
What is the time complexity of radix sort?
O (d(n + k)) for d digit passes, which is effectively O (n) when d and the digit range k are constants
What is the average case of bucket sort?
O (n) [when the input is a uniform distribution over the interval of 0 to 1]
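A bucket sort sketch under that same assumption of keys uniformly distributed in [0, 1); Python's built-in sort stands in for the per-bucket insertion sort usually shown in textbooks:

```python
def bucket_sort(a):
    # Scatter into n buckets by value, sort each (tiny) bucket, concatenate.
    n = len(a)
    buckets = [[] for _ in range(n)]
    for x in a:
        buckets[int(x * n)].append(x)   # assumes 0 <= x < 1
    out = []
    for b in buckets:
        b.sort()                        # textbook versions use insertion sort here
        out.extend(b)
    return out
```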
Time complexity of Egyptian, Lattice, and Grade-School multiplication
O (n^2)
What is the average case of insertion sort?
O (n^2)
What is the time complexity of selection sort for all cases?
O (n^2)
What is the worst case of bucket sort?
O (n^2)
What is the worst case of insertion sort?
O (n^2) (when the input is reverse sorted)
What is the time complexity of heapsort?
O (nlogn) [ build-max-heap is O(n), then n-1 calls to max-heapify at O(logn) each ]
What is the best case of sequential search?
O(1)
integer factorization
Process of finding all of a number's prime factors and how many times they occur
What is heapsort?
Builds a max-heap from the array, then repeatedly swaps the maximum (the root) with the last element of the heap and shrinks the heap by one; conceptually like selection sort, but it uses the array's heap structure to find the maximum.
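A compact heapsort sketch using a 0-indexed array as the heap:

```python
def max_heapify(a, i, size):
    # Sift a[i] down until the subtree rooted at i is a max-heap. O(log n).
    largest = i
    left, right = 2 * i + 1, 2 * i + 2
    if left < size and a[left] > a[largest]:
        largest = left
    if right < size and a[right] > a[largest]:
        largest = right
    if largest != i:
        a[i], a[largest] = a[largest], a[i]
        max_heapify(a, largest, size)

def heapsort(a):
    n = len(a)
    for i in range(n // 2 - 1, -1, -1):   # build-max-heap, O(n) overall
        max_heapify(a, i, n)
    for end in range(n - 1, 0, -1):       # pull the max to the end, shrink the heap
        a[0], a[end] = a[end], a[0]
        max_heapify(a, 0, end)
```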
What is the recurrence relation for a recursive algorithm that splits the input into 2 halves and does a constant amount of work?
T(n) = 2T(n/2) + c [ Θ (n) ]
What is the recurrence relation for a recursive algorithm that loops through the input to eliminate one item? (Can't be solved with the master theorem)
T(n) = T(n-1) + n [ Θ (n^2) ]
What is the recurrence relation for an algorithm that halves the input in one step?
T(n) = T(n/2) + c [ Θ (logn) ]
What is the recurrence relation for an algorithm that halves the input but must examine every item in the input?
T(n) = T(n/2) + n [ Θ (n) ]
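Unrolling this last recurrence (assuming n is a power of 2) shows why the answer is linear even though the recursion has log n levels:

```latex
T(n) = n + T(n/2)
     = n + \tfrac{n}{2} + T(\tfrac{n}{4})
     = n + \tfrac{n}{2} + \tfrac{n}{4} + \cdots + 1 < 2n,
\qquad \text{so } T(n) = \Theta(n).
```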
Why is counting sort impractical?
It needs a count array covering the whole key range k; for 32-bit keys k can be as large as 2^32, so the space and initialization cost are impractical unless the range is small.
What is Quick Select?
Solves the selection problem: finding the ith smallest element (the minimum or the median, for example; the median can also serve as a good quicksort pivot). It partitions the set around a pivot, as quicksort does, and then looks for the ith element on only the relevant side of the pivot.
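A quickselect sketch, reusing the partition helper from the quicksort sketch above; i is the 0-indexed rank of the element we want:

```python
def quickselect(a, i, lo=0, hi=None):
    # Find the i-th smallest element of a (0 <= i < len(a)) by partitioning
    # and recursing into only the side that contains index i.
    if hi is None:
        hi = len(a) - 1
    if lo == hi:
        return a[lo]
    p = partition(a, lo, hi)            # Lomuto partition from the quicksort sketch
    if i == p:
        return a[p]
    if i < p:
        return quickselect(a, i, lo, p - 1)
    return quickselect(a, i, p + 1, hi)
```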
When is insertion sort better than quicksort?
When the input size is less than about 20
What is selection sort?
Repeatedly finds the smallest element in the unsorted part of the array and swaps it into place at the front
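A minimal selection sort sketch; the nested loops make exactly the 1 + 2 + ... + (n-1) comparisons counted at the top of this set:

```python
def selection_sort(a):
    # Pass i finds the smallest element of the unsorted suffix a[i:] and swaps it to a[i].
    n = len(a)
    for i in range(n - 1):
        smallest = i
        for j in range(i + 1, n):
            if a[j] < a[smallest]:
                smallest = j
        a[i], a[smallest] = a[smallest], a[i]
```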
What is the worst, best, and average case of hashing?
Worst: O(n), Best: O(1), Average: O(1)
Big Omega (Ω)
asymptotic lower bound ( f(n) >= g(n) )
Big O
asymptotic upper bound ( f(n) <= g(n) )
Order these terms from slowest to fastest growth: polynomial, factorial, exponential, constant, polylogarithmic, logarithmic
constant, logarithmic, polylogarithmic, polynomial, exponential, factorial
f(n) in the T(n) equation
the extra (non-recursive) work done outside the recursive calls
n/b in the T(n) equation
size of the input for the recursive call
Big Theta (Θ)
asymptotically tight bound (both an upper and a lower bound)
Little omega (ω)
strict (non-tight) lower bound ( f(n) grows strictly faster than g(n) )
Little o
strict (non-tight) upper bound ( f(n) grows strictly slower than g(n) )
What is the average case of Quick Select?
Θ (n)
What is the best case of Quick Select?
Θ (n) [if we always partition around the median element]
What is the worst case of Quick Select?
Θ (n^2) [if you partition around the smallest/largest element]
What is the worst case time complexity of Quicksort?
Θ (n^2) [picking the end pivot in a sorted array, for example]
What is the average time complexity of Quicksort?
Θ (nlogn) [even if the partition always splits as unevenly as 9 to 1]