Analysis of Algorithms Quiz Questions
Give the recurrence for the worst-case running time for QuickSort, and give the time complexity
T(n) = T(n-1) + O(n) = O(n^2)
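(Worst case: the pivot is always an extreme element, so the partitioning costs unroll as n + (n-1) + ... + 1 = n(n+1)/2 = O(n^2).)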
Suppose we have an O(n)-time algorithm that finds the median of an unsorted array. Now consider a quicksort implementation that first finds the median using this algorithm and then uses it as the pivot. What is the worst-case time complexity of this algorithm? a. O(n^2 * lgn) b. O(nlgn) c. O(nlgnlgn) d. O(n^2)
b. O(nlgn)
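(With a guaranteed median pivot every split is even, so T(n) = 2T(n/2) + O(n), where the O(n) covers both median selection and partitioning; this solves to O(nlgn).)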
Which of the following is O(nlgn)? a. 3n * 100n * lgn b. 3n^2 - 100n - 6 c. n^3 * lgn d. 3n * lgn + 100
d. 3n * lgn + 100
Sorting can be solved using the support of which data structure? a. hash tables b. heaps c. linked lists d. all of the above
d. all options are correct
In the sequence 11, 4, 20, 45, 32, 60, 98, 70: Which is the pivot element?
20
T/F: Counting sort is an in-place algorithm
False
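A minimal C sketch of counting sort (assuming non-negative integer keys bounded by k); the auxiliary count and output arrays are why it is not in-place:

/* Counting sort: assumes keys are integers in the range [0, k]. */
#include <stdio.h>
#include <stdlib.h>

void counting_sort(const int *a, int *out, int n, int k) {
    int *count = calloc(k + 1, sizeof(int));                      /* auxiliary storage */
    for (int i = 0; i < n; i++) count[a[i]]++;                    /* histogram of keys */
    for (int v = 1; v <= k; v++) count[v] += count[v - 1];        /* prefix sums */
    for (int i = n - 1; i >= 0; i--) out[--count[a[i]]] = a[i];   /* stable placement */
    free(count);
}

int main(void) {
    int a[] = {4, 1, 3, 4, 2}, out[5];
    counting_sort(a, out, 5, 4);
    for (int i = 0; i < 5; i++) printf("%d ", out[i]);            /* prints 1 2 3 4 4 */
    printf("\n");
    return 0;
}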
T/F: There is only one kind of inductive method
False
T/F: B-trees work as binary trees but with different children
False
T/F: Brute force, the greedy approach, and dynamic programming are techniques that can solve the same problems they deal with; the only difference is in efficiency
False
T/F: Every problem in theory of numbers can be proven by the inductive method
False
T/F: Heap sort is NOT a comparison based sorting algorithm
False
T/F: Heaps, trees, and hash tables are essentially the same data structures
False
T/F: Heapsort uses a heap to sort the elements. The use of a heap makes this algorithm the most efficient in the worst case for sorting
False
T/F: If f(n) belongs to Ω(g(n)), then it also belongs to θ(g(n))
False
T/F: Quicksort is an in-place sorting algorithm
False
T/F: Randomly picking the pivot in quicksort increases the running time
False
T/F: The fractional knapsack problem cannot be solved using dynamic programming
False
T/F: The recursive algorithm implementation of insertion sort is faster than an iterative implementation
False
T/F: The worst-case running time and expected running time are equal to within a constant factor for any algorithm
False
T/F: When dealing with the longest common subsequence problem, the brute force algorithm for 2 sequences of 5 chars each will never end.
False
Give the intermediate steps of the array A = [15, 20, 10, 18] when sorted with quick sort
Step 1: 10, 20, 15, 18
Step 2: 10, 15, 20, 18
Step 3: 10, 15, 18, 20
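A minimal C sketch of quicksort using a Hoare-style partition with the first element as pivot; the exact intermediate arrays may differ from the trace above depending on the partition scheme the course uses:

#include <stdio.h>

void swap(int *x, int *y) { int t = *x; *x = *y; *y = t; }

/* Hoare partition with a[lo] as pivot; returns the split point. */
int partition(int a[], int lo, int hi) {
    int pivot = a[lo];
    int i = lo - 1, j = hi + 1;
    for (;;) {
        do { i++; } while (a[i] < pivot);
        do { j--; } while (a[j] > pivot);
        if (i >= j) return j;
        swap(&a[i], &a[j]);
    }
}

void quicksort(int a[], int lo, int hi) {
    if (lo < hi) {
        int p = partition(a, lo, hi);
        quicksort(a, lo, p);        /* left part, up to the split point */
        quicksort(a, p + 1, hi);    /* right part */
    }
}

int main(void) {
    int a[] = {15, 20, 10, 18};
    quicksort(a, 0, 3);
    for (int i = 0; i < 4; i++) printf("%d ", a[i]);   /* prints 10 15 18 20 */
    printf("\n");
    return 0;
}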
Give a recurrence that describes the worst-case running time for merge sort, and give its worst-case running time using θ-notation
T(n) = 2T(n/2) + θ(n) = θ(nlgn)
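(Recursion-tree view: the tree has θ(lgn) levels and the merging work on each level sums to θ(n), giving θ(nlgn) in total.)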
Give a recurrence that describes the worst-case running time for Strassen's algorithm, and give its worst-case running time using θ-notation
T(n) = 7T(n/2) + θ(n^2) = θ(n^lg7)
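(Master theorem check: a = 7, b = 2, f(n) = θ(n^2); since n^(log2(7)) ≈ n^2.81 polynomially dominates n^2, case 1 gives θ(n^(lg7)).)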
Use the substitution method to give a tight upper bound on the solution to the recurrence T(n) = T(n/2) + T(n/4) + n using big-O notation
T(n) = O(n)
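Substitution sketch: guess T(n) <= cn; then T(n) <= c(n/2) + c(n/4) + n = (3c/4 + 1)n <= cn for any c >= 4, so O(n) holds (O(nlgn) is also an upper bound, but it is not tight).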
Give a recurrence that describes the worst-case running time for binary search, and give its worst-case running time using θ-notation
T(n) = T(n/2) + θ(1) = θ(lgn)
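A minimal C sketch of iterative binary search on a sorted array; each comparison halves the search range, which is exactly the T(n) = T(n/2) + θ(1) recurrence:

#include <stdio.h>

int binary_search(const int a[], int n, int key) {
    int lo = 0, hi = n - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;   /* avoids overflow of lo + hi */
        if (a[mid] == key) return mid;
        if (a[mid] < key) lo = mid + 1; /* discard the left half */
        else hi = mid - 1;              /* discard the right half */
    }
    return -1;                          /* key not found */
}

int main(void) {
    int a[] = {4, 11, 20, 32, 45, 60, 70, 98};
    printf("%d\n", binary_search(a, 8, 45));   /* prints 4 */
    return 0;
}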
Give asymptotic upper and lower bounds for T(n) for the recurrence T(n) = T(sqrt(n)) + θ(lglgn)
T(n) = θ((lglgn)^2)
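(Substituting m = lgn gives S(m) = S(m/2) + θ(lgm), which sums to θ(lg^2 m) = θ((lglgn)^2).)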
Give asymptotic upper and lower bounds for T(n) for the recurrence T(n) = T(n/2) + 2^n
T(n) = θ(2^n)
Give asymptotic upper and lower bounds for T(n) for the recurrence T(n) = T(n/2 + sqrt(n)) + sqrt(6046)
T(n) = θ(lgn)
Give asymptotic upper and lower bounds for T(n) for the recurrence T(n) = T(n-2) + lgn
T(n) = θ(nlgn) (unrolling gives the sum lgn + lg(n-2) + lg(n-4) + ..., which has θ(n) terms, a constant fraction of which are θ(lgn))
Give asymptotic upper and lower bounds for T(n) for the recurrence T(n) = 3T(n/5) + lg^2n
T(n) = θ(n^(log5(3))) (master theorem case 1: n^(log5(3)) ≈ n^0.68 polynomially dominates lg^2 n)
Give asymptotic upper and lower bounds for T(n) for the recurrence T(n) = 7T(n/2) + n^3
T(n) = θ(n^3)
Give asymptotic upper and lower bounds for T(n) for the recurrence T(n) = 10T(n/3) + 17n^1.2
T(n) = θ(n^(log3(10)))
Give asymptotic upper and lower bounds for T(n) for the recurrence T(n) = sqrt(n) * T(sqrt(n)) + 100n
T(n) = θ(nlglgn)
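(Dividing both sides by n gives T(n)/n = T(sqrt(n))/sqrt(n) + 100; with S(n) = T(n)/n this becomes S(n) = S(sqrt(n)) + 100, which recurses θ(lglgn) times, so T(n) = θ(nlglgn).)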
Give asymptotic upper and lower bounds for T(n) for the recurrence T(n) = 2T(n/3) + nlgn
T(n) = θ(nlgn)
Give asymptotic upper and lower bounds for T(n) for the recurrence T(n) = T(n/5) + T(4n/5) + θ(n)
T(n) = θ(nlgn)
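(Recursion-tree view: the subproblem sizes on each level sum to at most n, and the longest path shrinks n by a factor of 5/4 per step, so there are θ(lgn) levels and T(n) = θ(nlgn).)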
T/F: Computationally intractable problems can be proven to be so by using induction
True
T/F: The Gauss formula can be proven by induction
True
T/F: Data structures are structures in memory that can help an algorithm solve problems efficiently
True
T/F: A function f(n) that is θ(n) is also O(n)
True
T/F: Any dollar sum greater than 12 can be formed by a combination of 4- and 5-dollar coins. This can be proven by induction
True
T/F: Counting sort always achieves O(n) complexity in the worst case, where n is the number of elements in the input list
True
T/F: Directed graphs have arrows and undirected graphs do not
True
T/F: Every odd number n can be represented by 2k+1 for some integer value k, and this can be proven by induction
True
T/F: If f(n) belongs to θ(g(n)), then it belongs to O(g(n))
True
T/F: In order to prove a problem can be solved with dynamic programming, it is only necessary to prove the principle of optimality
True
T/F: Mathematical induction in some form is the foundation of all correctness proofs for computer programs
True
T/F: The inductive method shows a few gaps and does not always ensure a valid solution when proven
True
T/F: f(n) = n^2 - nlgn belongs to Ω(n^2)
True
Which of the following is not O(n^2)? a. (n^3)/(sqrt(n)) b. n + 10000n c. 105n + 26n d. n*1.9999
a. (n^3)/(sqrt(n))
A graph is: a. a set of vertices and edges b. a balanced b-tree c. a hash tree d. a green tree
a. a set of vertices and edges
Which strategy is more efficient for solving the 0-1 knapsack problem? a. dynamic programming b. greedy c. brute force d. trial and error
a. dynamic programming
The shortest path problem can be solved in any case by: a. dynamic programming and brute force b. brute force c. greedy d. dynamic programming e. dynamic programming and greedy
a. dynamic programming and brute force
What is the most efficient technique in terms of time and space: a. greedy b. dynamic programming c. brute force d. none of the options are correct
a. greedy
With counting sort, in the worst-case it is sometimes possible to reduce the order of growth to a linear complexity using: a. hash tables b. heaps c. binary trees d. b-trees
a. hash tables
Which is the worst algorithm in the worst-case: a. quicksort b. heapsort c. mergesort
a. quicksort
int fun2(int n) { if (n <= 1) return n; return fun2(n-1) + fun2(n-1); } has a running time of: a. θ(2^n) b. θ(n) c. θ(lgn) d. θ(n^2)
a. θ(2^n)
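(The recurrence is T(n) = 2T(n-1) + θ(1): the work doubles each time n decreases by one, giving θ(2^n).)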
int fun(int n) { int count = 0; for (int i = 0; i < n; i++) for (int j = i; j > 0; j--) count = count + 1; return count; } The running time is: a. θ(n^2) b. θ(n) c. θ(1) d. θ(nlgn)
a. θ(n^2)
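(For each i the inner loop runs i times, so count = 0 + 1 + ... + (n-1) = n(n-1)/2 = θ(n^2).)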
A heap is: a. a perfect binary tree b. a complete binary tree c. a complete b-tree d. a balanced tree
b. a complete binary tree
Proving the principle of optimality is necessary to build the following approach: a. greedy b. dynamic programming c. both d. neither
c. both
The inductive method consists of two steps:
The base case and the inductive step
Which sorting algorithm in its typical implementation gives the best performance when applied on an array that is sorted or almost ( 1 or 2 elements misplaced) sorted?
Insertion sort
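A minimal C sketch of insertion sort; on a sorted or almost-sorted input the inner while loop exits almost immediately, so the running time is close to θ(n):

#include <stdio.h>

void insertion_sort(int a[], int n) {
    for (int i = 1; i < n; i++) {
        int key = a[i], j = i - 1;
        while (j >= 0 && a[j] > key) {   /* shift larger elements one slot right */
            a[j + 1] = a[j];
            j--;
        }
        a[j + 1] = key;                  /* drop the key into its place */
    }
}

int main(void) {
    int a[] = {1, 2, 4, 3, 5, 6};        /* almost sorted: one misplaced pair */
    insertion_sort(a, 6);
    for (int i = 0; i < 6; i++) printf("%d ", a[i]);   /* prints 1 2 3 4 5 6 */
    printf("\n");
    return 0;
}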