Algorithms Exam 2 Review: Dynamic Programming, Greedy Algorithms, & Amortized Analysis

Greedy Algorithms

A greedy algorithm always makes the choice that looks best at the moment. That is, it makes a locally optimal choice in the hope that this choice will lead to a globally optimal solution. Greedy algorithms do not always yield an optimal solution, but for many problems they do. They are usually applied to optimization problems.
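A coin-change sketch illustrates both outcomes (the function name and coin systems are chosen for illustration): the greedy choice is globally optimal for the US-style system but not for every system:

```python
def greedy_coin_change(denominations, amount):
    """Repeatedly take the largest coin that still fits (the locally best choice)."""
    coins = []
    for d in sorted(denominations, reverse=True):
        while amount >= d:
            amount -= d
            coins.append(d)
    return coins

# For the system {1, 5, 10, 25}, greedy happens to be globally optimal:
print(greedy_coin_change([1, 5, 10, 25], 63))  # [25, 25, 10, 1, 1, 1]
# For {1, 3, 4} it is not: greedy makes 6 with three coins, but 3 + 3 uses two.
print(greedy_coin_change([1, 3, 4], 6))        # [4, 1, 1]
```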

First step of developing a dynamic programming algorithm:

Characterize the structure of an optimal solution

Assignment 4 9. Suppose we perform a sequence of n operations on a data structure in which the 𝑖th operation costs 𝑖 if 𝑖 is an exact power of 2, and 1 otherwise. Use aggregate analysis to determine the amortized cost per operation.

Ci: cost of the i-th operation:

Ci = i, if i is an exact power of 2
Ci = 1, otherwise

For n operations, split the total cost T(n) into the cost of the operations at power-of-2 indices and the cost of the rest (e.g., for n = 4 the costs are 1, 2, 1, 4). There are floor(lg n) + 1 power-of-2 indices i <= n, and their costs sum to 1 + 2 + 4 + ... + 2^floor(lg n) < 2n. The remaining n - floor(lg n) - 1 operations cost 1 each, for at most n in total. So

T(n) < 2n + n = 3n,

and by aggregate analysis the amortized cost per operation is T(n)/n < 3 = O(1).
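A quick empirical check of the bound (the helper name is illustrative); the bit trick i & (i - 1) == 0 holds exactly when i is an exact power of 2:

```python
def total_cost(n):
    """Sum the per-operation costs: i when i is an exact power of 2, else 1."""
    return sum(i if i & (i - 1) == 0 else 1 for i in range(1, n + 1))

for n in [4, 8, 100, 1000]:
    print(n, total_cost(n), total_cost(n) / n)  # the ratio stays below 3
```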

Third step of developing a dynamic programming algorithm:

Compute the value of an optimal solution, typically in a bottom up fashion

Fourth step of developing a dynamic programming algorithm:

Construct an optimal solution from computed information

Assignment 3 1. Algorithms designed using dynamic programming are faster than ALL algorithms designed recursively.

False. Dynamic programming helps only when subproblems overlap. For problems without overlapping subproblems (such as merge sort), divide and conquer is the right design, and a recursive algorithm is not slower.

Assignment 3 7. There can be only one longest common sub-sequence (LCS) for any two distinct strings.

False. LCS is an optimization problem, and several subsequences can share the maximum length. For example, X = "ab" and Y = "ba" have two LCSs, "a" and "b".

Assignment 3 6. In the longest common subsequence (LCS) example from class, the LCS of two strings X and Y must contain elements that are consecutive from the original strings.

False. A subsequence need not be consecutive in the original strings; only the relative order of its elements must be preserved. (A consecutive subsequence is a substring.)
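A bottom-up sketch of the standard LCS length recurrence (c[i][j] is the LCS length of the first i characters of X and the first j characters of Y):

```python
def lcs_length(X, Y):
    """Bottom-up LCS table: extend on a match, otherwise take the better neighbor."""
    m, n = len(X), len(Y)
    c = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if X[i - 1] == Y[j - 1]:
                c[i][j] = c[i - 1][j - 1] + 1
            else:
                c[i][j] = max(c[i - 1][j], c[i][j - 1])
    return c[m][n]

print(lcs_length("ABCBDAB", "BDCABA"))  # 4 ("BCBA" is one LCS; it is not consecutive)
```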

Highest peak in a greedy algorithm:

Global optimal solution

Dynamic Programming

Like the divide-and-conquer method, dynamic programming solves problems by combining the solutions to subproblems. In contrast to divide and conquer, it applies when the subproblems overlap, i.e., when subproblems share subsubproblems.

Smaller peaks in a greedy algorithm:

Local optimal solution

For the 0-1 knapsack problem, is Greedy Algorithm a good choice?

No. A greedy choice (for example, taking items in order of value-to-weight ratio) can exclude the optimal subset; dynamic programming is needed for an optimal solution.
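A small comparison makes the point: greedy by value-to-weight ratio versus the standard 0-1 knapsack DP on a classic three-item instance (the numbers are illustrative):

```python
def greedy_01(items, capacity):
    """Greedy by value/weight ratio, whole items only (can be suboptimal)."""
    value = 0
    for v, w in sorted(items, key=lambda it: it[0] / it[1], reverse=True):
        if w <= capacity:
            capacity -= w
            value += v
    return value

def dp_01(items, capacity):
    """Classic 0-1 knapsack DP: best[c] = max value achievable with capacity c."""
    best = [0] * (capacity + 1)
    for v, w in items:
        for c in range(capacity, w - 1, -1):  # descending so each item is used once
            best[c] = max(best[c], best[c - w] + v)
    return best[capacity]

items = [(60, 10), (100, 20), (120, 30)]  # (value, weight)
print(greedy_01(items, 50))  # 160: ratio order takes the 10- and 20-weight items
print(dp_01(items, 50))      # 220: the optimum skips the best-ratio item entirely
```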

We typically apply dynamic programming to ____________________

Optimization problems

Second step of developing a dynamic programming algorithm:

Recursively define the value of an optimal solution

Assignment 3 9. A palindrome is a nonempty string over some alphabet that reads the same forward and backward. Examples of palindromes are all strings of length 1, "civic", "racecar", and "aibohphobia". Give an efficient dynamic algorithm that can return the length of the longest palindrome that is a subsequence of a given input string. For example, given the input "character", your algorithm should return 5, which is the length of "carac". Give the running time of your algorithm.

Step 1: Let f(i, j) be the length of the longest palindromic subsequence of A[i..j].

Step 2: Recurrence:

f(i, j) = 1, if i = j
f(i, j) = 0, if i > j
f(i, j) = f(i + 1, j - 1) + 2, if A[i] = A[j]
f(i, j) = max(f(i + 1, j), f(i, j - 1)), if A[i] != A[j]

Plain recursive version:

int LSP(A, i, j) {
    if (i == j) return 1;
    else if (i > j) return 0;
    else if (A[i] == A[j]) return LSP(A, i + 1, j - 1) + 2;
    else {
        B = LSP(A, i + 1, j);
        C = LSP(A, i, j - 1);
        if (B > C) return B;
        else return C;
    }
}

Step 3: Create a two-dimensional array D[1..n][1..n], initialized to -1, and memoize so that each (i, j) subproblem is computed only once:

int LSP(A, i, j) {
    if (D[i][j] != -1) return D[i][j];
    if (i == j) D[i][j] = 1;
    else if (i > j) D[i][j] = 0;
    else if (A[i] == A[j]) D[i][j] = LSP(A, i + 1, j - 1) + 2;
    else {
        B = LSP(A, i + 1, j);
        C = LSP(A, i, j - 1);
        if (B > C) D[i][j] = B;
        else D[i][j] = C;
    }
    return D[i][j];
}

There are O(n^2) subproblems and each is filled in O(1) time, so the running time is O(n^2). The answer is LSP(A, 1, n).
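The same recurrence in a bottom-up Python sketch, filling the table by increasing substring length:

```python
def lps_length(s):
    """Bottom-up longest palindromic subsequence: f[i][j] covers s[i..j]."""
    n = len(s)
    f = [[0] * n for _ in range(n)]
    for i in range(n):
        f[i][i] = 1  # every single character is a palindrome
    for length in range(2, n + 1):
        for i in range(n - length + 1):
            j = i + length - 1
            if s[i] == s[j]:
                f[i][j] = (f[i + 1][j - 1] if length > 2 else 0) + 2
            else:
                f[i][j] = max(f[i + 1][j], f[i][j - 1])
    return f[0][n - 1]

print(lps_length("character"))  # 5 ("carac")
```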

Assignment 3 3. In the Rod Cutting example from the class, the top-down without memoization algorithm has time complexity of n^2.

False. Without memoization, the top-down recursion re-solves the same subproblems repeatedly and runs in Θ(2^n); it is the memoized top-down version that runs in Θ(n^2).

Assignment 3 4. In the Rod Cutting example from the class, the bottom-up method algorithm has time complexity of n^2.

True.
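A Python sketch of the Θ(n^2) bottom-up method, using the CLRS price table:

```python
def bottom_up_cut_rod(p, n):
    """r[j] = max revenue for a rod of length j; two nested loops give Theta(n^2)."""
    r = [0] * (n + 1)
    for j in range(1, n + 1):
        r[j] = max(p[i] + r[j - i] for i in range(1, j + 1))
    return r[n]

# p[i] = price of a piece of length i (CLRS sample table; p[0] = 0 is a sentinel)
prices = [0, 1, 5, 8, 9, 10, 17, 17, 20, 24, 30]
print(bottom_up_cut_rod(prices, 4))  # 10 (two pieces of length 2)
```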

Assignment 3 5. Elements of dynamic programming include: 1) optimal substructure, 2) overlapping sub-problems, 3) memoization.

True.

Assignment 4 1. A greedy algorithm always makes the choice that looks best at the moment, that is why it can always lead to a locally optimal solution.

True.

Assignment 4 3. The Activity-selection problem that we went over in the class cannot be solved using dynamic algorithms.

False. The activity-selection problem does have a dynamic-programming formulation (CLRS develops one before presenting the greedy solution); the greedy algorithm is simply simpler and faster.

Assignment 4 4. The Huffman codes algorithm that we went over in class has time complexity of n lg n

True. Using a binary min-heap, building the tree takes n - 1 merge steps, each costing O(lg n) for the heap operations, for O(n lg n) total.

Assignment 4 7. In an amortized analysis, we average the time required to perform a sequence of operations.

True.

Amortized Analysis

We average the time required to perform a sequence of data structure operations over all the operations performed
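A classic instance is appending to a dynamic array that doubles its capacity when full. The sketch below is a hypothetical cost model (one unit per element written): a single append can cost Θ(n), but the average over n appends stays below 3:

```python
def doubling_append_costs(n):
    """Cost of each of n appends: 1, plus the number of elements copied
    when the array is full and its capacity must double."""
    costs, size, capacity = [], 0, 1
    for _ in range(n):
        if size == capacity:           # full: copy everything into a bigger table
            costs.append(1 + size)
            capacity *= 2
        else:
            costs.append(1)
        size += 1
    return costs

costs = doubling_append_costs(100)
print(max(costs), sum(costs) / len(costs))  # worst single cost is large, average < 3
```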

For the fractional knapsack problem, is Greedy Algorithm a good choice?

Yes
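A sketch of the greedy fractional-knapsack strategy (the function name and example items are illustrative):

```python
def fractional_knapsack(items, capacity):
    """Take items in decreasing value/weight order, splitting the last one if needed."""
    value = 0.0
    for v, w in sorted(items, key=lambda it: it[0] / it[1], reverse=True):
        take = min(w, capacity)
        value += v * take / w
        capacity -= take
        if capacity == 0:
            break
    return value

items = [(60, 10), (100, 20), (120, 30)]  # (value, weight)
print(fractional_knapsack(items, 50))     # 240.0: two whole items plus 2/3 of the third
```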

Assignment 4 8. What is an optimal Huffman code for the following set of frequencies? a:1 b:1 c:2 d:3 e:5 f:8 g:13 h:21

h: 0
g: 10
f: 110
e: 1110
d: 11110
c: 111111
a: 1111100
b: 1111101
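One way to check the codeword lengths above is to build the tree with a min-priority queue; this sketch uses Python's heapq (the tie-breaking counter is an implementation detail, not part of the algorithm):

```python
import heapq
from itertools import count

def huffman_codes(freqs):
    """Repeatedly merge the two least-frequent trees, then read codes off the tree."""
    tiebreak = count()
    heap = [(f, next(tiebreak), sym) for sym, f in freqs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, next(tiebreak), (left, right)))
    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):       # internal node: 0 = left, 1 = right
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:                             # leaf: record the codeword
            codes[node] = prefix or "0"
    walk(heap[0][2], "")
    return codes

freqs = {"a": 1, "b": 1, "c": 2, "d": 3, "e": 5, "f": 8, "g": 13, "h": 21}
codes = huffman_codes(freqs)
print({s: len(c) for s, c in sorted(codes.items())})  # lengths 7,7,6,5,4,3,2,1
```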

Elements of Greedy Strategy

1. Determine the optimal substructure of the problem.
2. Develop a recursive solution.
3. Show that if we make the greedy choice, then only one subproblem remains.
4. Prove that it is always safe to make the greedy choice.
5. Develop a recursive algorithm that implements the greedy strategy.
6. Convert the recursive algorithm to an iterative algorithm.
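Following steps 5-6 for the activity-selection problem, the recursive greedy algorithm converts to this iterative sketch (assuming nonnegative start times; the instance is the CLRS example):

```python
def activity_selection(starts, finishes):
    """Iterative greedy: scan activities in order of finish time and keep each
    one that starts no earlier than the last selected activity finishes."""
    order = sorted(range(len(starts)), key=lambda i: finishes[i])
    selected, last_finish = [], 0
    for i in order:
        if starts[i] >= last_finish:
            selected.append(i)
            last_finish = finishes[i]
    return selected

s = [1, 3, 0, 5, 3, 5, 6, 8, 8, 2, 12]
f = [4, 5, 6, 7, 9, 9, 10, 11, 12, 14, 16]
print(activity_selection(s, f))  # [0, 3, 7, 10]: four mutually compatible activities
```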

Elements of Dynamic Programming

1. Optimal substructure
2. Overlapping subproblems
3. Memoization

Assignment 3 2. Recursive algorithms that produce duplicate sub-problems are slower than Dynamic Programming because DP ignores the duplicate sub-problems thus saving computing times.

False. Dynamic programming does not ignore duplicate subproblems; it solves each subproblem once, stores the result, and reuses it whenever the subproblem recurs.

Assignment 4 2. A greedy algorithm always makes the choice that looks best at the moment, that is why it can ALWAYS lead to a globally optimal solution.

False. A sequence of locally optimal choices does not always lead to a globally optimal solution (the 0-1 knapsack problem is a counterexample).

Assignment 4 5. It is always safe to use greedy algorithm in terms of obtaining an optimal solution.

False. Greedy algorithms are only safe when the problem has the greedy-choice property; otherwise they can return suboptimal solutions.

Assignment 4 6. The 0-1 knapsack problem can be better solved using greedy algorithms.

False. A greedy choice (e.g., by value-to-weight ratio) can return a suboptimal packing for 0-1 knapsack; dynamic programming solves it optimally.

Assignment 3 8. The Fibonacci numbers are defined by recursion: f(n) = f(n-1)+f(n-2). Give an O(n)-time dynamic programming algorithm to compute the nth Fibonacci number.

Naive recursive version (with f(1) = a and f(2) = b as the base cases):

int fib(n, a, b) {
    if (n == 1) return a;
    else if (n == 2) return b;
    else return fib(n - 1, a, b) + fib(n - 2, a, b);
}

This runs in exponential time because the same subproblems are recomputed over and over. Memoizing the results with dynamic programming brings it down to O(n):

A[1, ..., n] = {-1};

int fib(n, a, b) {
    if (A[n] != -1) return A[n];
    if (n == 1) A[n] = a;
    else if (n == 2) A[n] = b;
    else A[n] = fib(n - 1, a, b) + fib(n - 2, a, b);
    return A[n];
}

Each of the n table entries is computed once in O(1) time, so the total running time is O(n).
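The memoized version is O(n) time and O(n) space; a bottom-up sketch needs only the last two values (here with f(1) = a and f(2) = b defaulting to 1):

```python
def fib(n, a=1, b=1):
    """Bottom-up O(n) Fibonacci keeping only the two most recent values."""
    if n == 1:
        return a
    for _ in range(n - 2):
        a, b = b, a + b   # slide the window forward one term
    return b

print([fib(n) for n in range(1, 11)])  # [1, 1, 2, 3, 5, 8, 13, 21, 34, 55]
```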

