Dynamic Programming


How to tell if a problem is good for recursion?

"Design an algorithm to compute the N" "Write code to list the first N" "Implement a method to compute all..."

Top-Down w/Memoization vs Bottom-Up

Both have the same asymptotic running time.
1. Bottom-up is more efficient by a constant factor because there is no overhead for recursive calls.
2. Bottom-up may benefit from better memory-access patterns.
3. Top-down can avoid computing solutions to subproblems that are never required.
4. Top-down is often more natural to write.
Top-down: start with the whole problem (e.g. the whole weight) and work backwards (return q, which equals r[n]). Bottom-up: start with the base case and work upwards until the full solution is reached (return r[n]).
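A minimal sketch of the two styles, using Fibonacci as a stand-in problem (the function names and the Fibonacci example are illustrative assumptions, not from the original):

```python
from functools import lru_cache

# Top-down with memoization: recurse from n; each subproblem
# result is cached, so it is computed at most once.
@lru_cache(maxsize=None)
def fib_top_down(n):
    if n < 2:
        return n
    return fib_top_down(n - 1) + fib_top_down(n - 2)

# Bottom-up: start from the base cases and build up to n,
# with no recursive-call overhead.
def fib_bottom_up(n):
    if n < 2:
        return n
    prev, curr = 0, 1
    for _ in range(2, n + 1):
        prev, curr = curr, prev + curr
    return curr
```

Both run in O(n); the bottom-up version also shows the memory benefit, keeping only the two previous values instead of a full cache.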

What is the difference between divide and conquer and dynamic programming algorithms?

- Divide and conquer solves disjoint subproblems recursively and combines the solutions, but when subproblems recur it often solves the common ones repeatedly and unnecessarily.
- Dynamic programming applies when subproblems overlap (when subproblems share subsubproblems); it saves each subproblem solution in a table in case it is needed again.

What are the ways to implement a dynamic programming approach?

- Top-down with memoization: recursive; previous subproblem solutions are saved and looked up before solving a new subproblem.
- Bottom-up method: requires some notion of subproblem "size", so each problem is solved by combining the solutions of smaller subproblems already computed.

Recognizing a dynamic programming problem

- You are redoing work for subproblems.
- A recursive solution exists.
- The only obvious solution is O(n^2), and the array must stay in order (and is not sorted).

Developing DP Algorithm

1. Characterise the structure of an optimal solution.
2. Recursively define the value of an optimal solution (test your definition very carefully!).
3. Compute the value of an optimal solution, typically in a bottom-up fashion.
4. Construct an optimal solution from the computed information.
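As an illustration of the four steps, here is a sketch using the classic rod-cutting problem (the problem choice, the `cut_rod` name, and the sample prices are assumptions, not from the original). Steps 1-2: the best revenue r[j] for a rod of length j is the best first cut i plus the optimal revenue for the remainder, r[j] = max over i of (prices[i] + r[j-i]). Step 3 fills the table bottom-up; step 4 reconstructs the cuts from the recorded choices.

```python
def cut_rod(prices, n):
    """Bottom-up rod cutting. prices[i] is the price of a rod of
    length i+1. Returns (max revenue, list of cut lengths)."""
    r = [0] * (n + 1)           # r[j] = best revenue for length j
    first_cut = [0] * (n + 1)   # best first cut for length j
    for j in range(1, n + 1):
        for i in range(1, j + 1):
            if prices[i - 1] + r[j - i] > r[j]:
                r[j] = prices[i - 1] + r[j - i]
                first_cut[j] = i
    # Step 4: walk the recorded choices to recover the actual cuts.
    cuts, j = [], n
    while j > 0:
        cuts.append(first_cut[j])
        j -= first_cut[j]
    return r[n], cuts
```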

Requirements

1. Optimal substructure
2. Overlapping sub-problems

Algorithm design paradigms: Dynamic programming

Break up a problem into a series of overlapping subproblems, and build up solutions to larger and larger subproblems.

Algorithm design paradigms: Divide-and-conquer

Break up a problem into independent subproblems, solve each subproblem, and combine the solutions to form a solution to the original problem. E.g. merge sort, quicksort.

Algorithm design paradigms: Greedy

Build up a solution incrementally, myopically optimizing some local criterion.

Pattern in question to identify DP questions?

Look for these phrases: length of smallest subsequence, fewest number of, the largest product, total number of ways to, maximum number of.

How to make recursion faster ?

Memoize previously computed results, or restructure the recursion bottom-up.

Dynamic Programming

Problems that can be solved recursively, but whose sub-problems overlap. A DP algorithm solves each subproblem once and saves the result in a table (avoiding re-computation). Typically used for optimisation problems.
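As a sketch of "solve each subproblem once and save the result in a table", here is a coin-change optimisation problem (the `min_coins` name and the coin values in the tests are illustrative assumptions):

```python
def min_coins(coins, amount):
    """Fewest coins summing to `amount`, or -1 if impossible.
    Each subproblem (a smaller amount) is solved exactly once
    and stored in the dp table, never recomputed."""
    INF = float("inf")
    dp = [0] + [INF] * amount   # dp[a] = fewest coins for amount a
    for a in range(1, amount + 1):
        for c in coins:
            if c <= a and dp[a - c] + 1 < dp[a]:
                dp[a] = dp[a - c] + 1
    return dp[amount] if dp[amount] != INF else -1
```

A naive recursion would re-solve the same amounts exponentially many times; the table reduces this to O(amount * len(coins)).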

Hint for recursive problem

Define the problem such that it can be built from solutions to sub-problems.

second step in determining optimal substructure?

Assume you are given a choice that leads to an optimal solution for the given problem.

Bottom-up Approach

Build the solution for each case from the solution to the previous case.
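One possible sketch of building each case off the previous one is Kadane's maximum-subarray algorithm (the example problem and the `max_subarray` name are assumptions, not from the original):

```python
def max_subarray(nums):
    """Kadane's algorithm: the best sum ending at index i is built
    directly from the best sum ending at index i-1."""
    best_ending_here = best = nums[0]
    for x in nums[1:]:
        # Either extend the previous case's run, or start fresh at x.
        best_ending_here = max(x, best_ending_here + x)
        best = max(best, best_ending_here)
    return best
```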

How to approach Recursion

Compute f(n) by adding something to, removing something from, or otherwise changing the solution for f(n-1); or, in some cases, solve for the first half of the data set, then the second half. Common approaches: bottom-up, top-down, and half-and-half.

Top Down Approach

Divide the problem for case N into subproblems. Be careful of overlap between the cases.

Half and half approach

Examples: binary search (first figure out which half of the array contains the value, then recursively search that half) and merge sort (sort each half of the array, then merge the sorted halves).
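The binary-search half of the example might be sketched as follows (the `binary_search` signature is illustrative):

```python
def binary_search(arr, target, lo=0, hi=None):
    """Recursive binary search on a sorted list: decide which half
    could contain `target`, then recurse into only that half.
    Returns an index, or -1 if the target is absent."""
    if hi is None:
        hi = len(arr) - 1
    if lo > hi:
        return -1               # empty range: not found
    mid = (lo + hi) // 2
    if arr[mid] == target:
        return mid
    if arr[mid] < target:
        return binary_search(arr, target, mid + 1, hi)   # right half
    return binary_search(arr, target, lo, mid - 1)       # left half
```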

Memoize

When you compute a subproblem's solution, write it down for future use.

Optimal substructure

An optimal solution contains optimal solutions to its sub-problems.

first step in determining optimal substructure?

Show that a solution involves making a choice that leaves one or more subproblems to be solved.

Overlapping sub-problems

The solution combines solutions to overlapping sub-problems. The space of subproblems must be "small": the same subproblems are solved over and over, rather than ever-new subproblems being generated.

informally, how is the running time of a dynamic programming algorithm determined?

Two factors:
- the number of subproblems
- the number of choices within each subproblem
Time complexity = (number of subproblems) × (time spent on each)
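For example, longest common subsequence has (m+1)×(n+1) subproblems, each taking O(1) time, giving O(m·n) overall (the LCS example and the `lcs_length` name are illustrative, not from the original):

```python
def lcs_length(a, b):
    """Longest common subsequence length of strings a and b.
    Subproblems: one per (i, j) prefix pair, so (m+1)*(n+1) total.
    Each cell takes O(1) work, so the running time is O(m * n)."""
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if a[i - 1] == b[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[m][n]
```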

fourth step in determining optimal substructure?

Use the "cut-and-paste" technique: prove by contradiction that each subproblem solution within your optimal solution must itself be optimal, since otherwise you could cut out the suboptimal part and paste in an optimal one, yielding a better overall solution.

third step in determining optimal substructure?

Given that choice, determine which subproblems arise.

