Algorithms and Data Structures, Ch. 4
Why is the bubble sort not a divide and conquer algorithm?
It does not divide the array into smaller arrays to sort
Which of the following is a characteristic of a greedy algorithm?
It makes the locally optimal choice at each step
What algorithm approach can be iterative or recursive?
Any
What is an optimal substructure?
An optimal solution to the problem contains optimal solutions to its sub-problems
Can the binary search be considered a dynamic programming problem?
No
How is dynamic programming different from divide and conquer?
The subproblems must have optimal substructure and overlap
What is the recursive case?
A subproblem large enough to solve recursively
Why is a priority queue implemented using a binary search tree and not a queue?
Better big-O performance: a balanced binary search tree supports insertion and removal of the highest-priority element in O(log n), while a plain queue needs O(n) to find that element
What is an example algorithm that cannot be approached using dynamic programming?
Bubble sort
How does a dynamic programming algorithm use memoization?
By storing the solution of a problem after solving it once
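As a minimal illustration (not from the quiz), the sketch below memoizes the Fibonacci recurrence in Java; the names fib and memo are hypothetical.
    static long[] memo;  // memo[i] caches fib(i) once computed; 0 means "not computed yet"
    static long fib(int n) {
        if (n <= 1) return n;                // base cases
        if (memo[n] != 0) return memo[n];    // reuse the stored solution
        memo[n] = fib(n - 1) + fib(n - 2);   // solve the subproblem only once
        return memo[n];
    }
    // Usage (illustrative): memo = new long[50]; System.out.println(fib(45));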
What is the combine step in the merge sort algorithm?
The merging step
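For illustration, a sketch of that merging step in Java, assuming the two halves a[lo..mid] and a[mid+1..hi] are already sorted (the helper name merge is hypothetical):
    static void merge(int[] a, int lo, int mid, int hi) {
        int[] tmp = new int[hi - lo + 1];
        int i = lo, j = mid + 1, k = 0;
        while (i <= mid && j <= hi)                       // take the smaller front element of the two halves
            tmp[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
        while (i <= mid) tmp[k++] = a[i++];               // copy whatever remains
        while (j <= hi)  tmp[k++] = a[j++];
        System.arraycopy(tmp, 0, a, lo, tmp.length);      // write the merged run back
    }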
In which case does a greedy algorithm approach to a problem not work?
When locally optimal choices do not lead to a globally optimal solution, i.e., the problem lacks the greedy-choice property
Consider the following two sequences: X = < B, C, D, C, A, B, C >, and Y = < C, A, D, B, C, B >. What is the length of the longest common subsequence?
4
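A small dynamic-programming sketch (illustrative; lcsLength is a hypothetical helper) that computes the LCS length and returns 4 for the sequences above:
    static int lcsLength(char[] x, char[] y) {
        int[][] dp = new int[x.length + 1][y.length + 1];  // dp[i][j] = LCS length of x[0..i) and y[0..j)
        for (int i = 1; i <= x.length; i++)
            for (int j = 1; j <= y.length; j++)
                dp[i][j] = (x[i - 1] == y[j - 1])
                        ? dp[i - 1][j - 1] + 1
                        : Math.max(dp[i - 1][j], dp[i][j - 1]);
        return dp[x.length][y.length];
    }
    // lcsLength("BCDCABC".toCharArray(), "CADBCB".toCharArray()) == 4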
Why is a priority queue used with the Huffman coding algorithm?
To repeatedly extract the two lowest-frequency nodes efficiently while building the Huffman tree
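A minimal sketch of that use, assuming a hypothetical Node class keyed by frequency and a java.util.PriorityQueue ordered by it; the two cheapest trees are merged until one remains:
    static class Node {
        int freq; Node left, right;
        Node(int f, Node l, Node r) { freq = f; left = l; right = r; }
    }
    static Node buildHuffmanTree(java.util.PriorityQueue<Node> pq) {
        // pq is assumed ordered by freq, e.g. new PriorityQueue<>(Comparator.comparingInt(n -> n.freq))
        while (pq.size() > 1) {
            Node a = pq.poll();                          // lowest frequency
            Node b = pq.poll();                          // second-lowest frequency
            pq.add(new Node(a.freq + b.freq, a, b));     // merged subtree goes back into the queue
        }
        return pq.poll();                                // root of the Huffman tree
    }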
What is the importance of algorithm design paradigms?
To make better decisions in algorithm choice and design
What is the master method used for?
To solve recurrences
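A standard worked example (not from the quiz): merge sort's recurrence is T(n) = 2T(n/2) + Θ(n), so a = 2, b = 2 and n^(log_b a) = n^(log_2 2) = n; since f(n) = Θ(n) matches that, case 2 of the master theorem gives T(n) = Θ(n log n).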
When do we use the dynamic programming approach?
When the solution has an optimal substructure
Consider the maximum sub-array problem for the array [ -1, 1, 2, -3, 5, -6, 5, 3]; what is the maximal sub-array length?
2; the maximum sub-array is [5, 3], with sum 8
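A Kadane-style sketch (one common linear-time check; the chapter's divide-and-conquer method yields the same result, and maxSubArraySum is a hypothetical name) that confirms the answer:
    static int maxSubArraySum(int[] a) {
        int best = a[0], endingHere = a[0];
        for (int i = 1; i < a.length; i++) {
            endingHere = Math.max(a[i], endingHere + a[i]);  // extend the current sub-array or restart at a[i]
            best = Math.max(best, endingHere);               // best sum seen so far
        }
        return best;
    }
    // maxSubArraySum(new int[]{-1, 1, 2, -3, 5, -6, 5, 3}) == 8, achieved by [5, 3] (length 2)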
Why does a greedy algorithm not use or require caching or memoization?
Because it makes an optimum selection locally that reaches a global optimum
Suppose that in the knapsack problem, you can break items so that you can take a fraction of the item's value and weight. Which algorithm design paradigm provides an efficient solution to this problem?
Greedy (take items in decreasing value-to-weight ratio; the fractional version of the knapsack problem, unlike the 0/1 version, is solved optimally by this greedy choice)
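A hedged sketch of that greedy choice in Java, assuming parallel arrays of values and weights (fractionalKnapsack is a hypothetical name): items are taken in decreasing value-per-weight order, with the last item possibly taken fractionally.
    static double fractionalKnapsack(double[] value, double[] weight, double capacity) {
        Integer[] order = new Integer[value.length];
        for (int i = 0; i < order.length; i++) order[i] = i;
        java.util.Arrays.sort(order, (a, b) ->                        // best value-per-weight ratio first
                Double.compare(value[b] / weight[b], value[a] / weight[a]));
        double total = 0;
        for (int i : order) {
            if (capacity <= 0) break;
            double take = Math.min(weight[i], capacity);              // whole item, or the fraction that still fits
            total += value[i] * (take / weight[i]);
            capacity -= take;
        }
        return total;
    }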
What is the algorithm approach for the problem of: "Given two jugs with the maximum capacity of i and j gallons respectively. Measure n gallons of water using these two jugs."
Dynamic programming
What is the algorithm paradigm or approach in this function?
    int algo(int n, int k) {
        if (k == 1 || k == 0) return k;
        if (n == 1) return k;
        int min = Integer.MAX_VALUE;
        int x, res;
        for (x = 1; x <= k; x++) {
            res = Math.max(algo(n - 1, x - 1), algo(n, k - x));
            if (res < min) min = res;
        }
        return min + 1;
    }
Dynamic programming
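As written, the function recomputes the same (n, k) pairs many times; a memoized variant (illustrative only, with a hypothetical cache array and the name algoMemo) makes the dynamic-programming character explicit:
    static int[][] cache;   // cache[n][k] stores the answer; 0 means "not computed yet"
    static int algoMemo(int n, int k) {
        if (k == 1 || k == 0) return k;
        if (n == 1) return k;
        if (cache[n][k] != 0) return cache[n][k];          // reuse an overlapping subproblem's result
        int min = Integer.MAX_VALUE;
        for (int x = 1; x <= k; x++) {
            int res = Math.max(algoMemo(n - 1, x - 1), algoMemo(n, k - x));
            if (res < min) min = res;
        }
        cache[n][k] = min + 1;
        return cache[n][k];
    }
    // Usage (illustrative): cache = new int[n + 1][k + 1]; algoMemo(n, k);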
For the following function, what is the algorithm paradigm?
    void algo(int idx, int[] num) {
        if (idx >= num.length) return;
        int min = Integer.MAX_VALUE;
        int pos = idx;
        for (int x = idx; x < num.length; x++) {
            if (num[x] < min) { min = num[x]; pos = x; }
        }
        num[pos] = num[idx];
        num[idx] = min;
        if (idx + 1 < num.length) algo(idx + 1, num);
    }
Recursive
What is greedy with respect to Java?
An algorithm design paradigm, not a Java language construct
What is significant about big-theta notation (Ө)?
It denotes an asymptotically tight bound, bounding the function from above and below, whereas big-O gives only an upper bound
Why does making a sorting algorithm recursive not always make it a divide and conquer algorithm?
Recursion is not the only requirement for an algorithm to be of divide and conquer type
Which of the following is true about overlapping subproblems?
The same subproblem is solved repeatedly
What is another name for Dijkstra's algorithm?
The single-source shortest path algorithm
What data structure is required to take a recursive divide and conquer algorithm and implement it iteratively?
Stack
What is an example of a recurrence that is not solvable by the master method?
T(n) = 2T(n/2) + n log n; here f(n) = n log n grows faster than n^(log_2 2) = n, but not polynomially faster, so none of the three master-theorem cases applies
In an operating system like Unix, a common shared resource is the processor or CPU, and multiple processes compete for this common resource. What algorithm approach can be used?
Greedy
In a greedy method, how many feasible solutions do we get?
One or more
What are the necessary ingredients of a greedy algorithm?
Optimal substructure and the greedy-choice property
What are the two ingredients for dynamic programming?
Optimal substructure and overlapping subproblems
What is the divide step of the quick sort?
Partitioning the array around a pivot
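A minimal Lomuto-style partition sketch (one common variant, assuming the last element is the pivot; the helper name partition is hypothetical):
    static int partition(int[] a, int lo, int hi) {
        int pivot = a[hi];
        int i = lo - 1;                                    // boundary of the "<= pivot" region
        for (int j = lo; j < hi; j++) {
            if (a[j] <= pivot) {
                i++;
                int t = a[i]; a[i] = a[j]; a[j] = t;       // move a[j] into the smaller-element region
            }
        }
        int t = a[i + 1]; a[i + 1] = a[hi]; a[hi] = t;     // place the pivot between the two regions
        return i + 1;                                      // final pivot index
    }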
Which of the following is/are property/properties of a dynamic programming problem?
They have both optimal substructure and overlapping subproblems
Can the insertion sort be considered a greedy algorithm?
No
Consider the coin change problem. What is the number of ways we can combine the coins for N = 100, S = { 25, 50 }?
3. The three solutions are: {25, 25, 25, 25}, {25, 25, 50}, and {50, 50}
Consider the coin change problem. What is the number of ways we can combine the coins for N = 10, S = { 2, 5, 3, 6 } ?
5. The five solutions are: {2, 2, 2, 2, 2}, {2, 2, 3, 3}, {2, 2, 6}, {2, 3, 5}, and {5, 5}
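A small counting sketch (illustrative; countWays is a hypothetical helper) that reproduces both coin-change answers; ways[j] counts the combinations summing to j using the coins processed so far.
    static long countWays(int[] coins, int n) {
        long[] ways = new long[n + 1];
        ways[0] = 1;                        // one way to make 0: take no coins
        for (int c : coins)                 // looping over coins outermost counts combinations, not permutations
            for (int j = c; j <= n; j++)
                ways[j] += ways[j - c];
        return ways[n];
    }
    // countWays(new int[]{25, 50}, 100) == 3 and countWays(new int[]{2, 5, 3, 6}, 10) == 5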
What is significant about big-omega notation (Ω)?
It denotes an asymptotic lower bound, whereas big-O denotes an upper bound
What does the dynamic programming approach do to prevent solving the same subproblem repeatedly?
Cache solutions to subproblems
What are the steps in a divide and conquer approach?
Divide, conquer, and combine
What is Huffman coding?
Lossless data compression algorithm
What is the number of approaches to algorithm design?
Many
Which of the following are divide and conquer algorithms?
Merge sort and quick sort