Divide and Conquer, Sorting and Searching Algorithms


Running time of merge sort

Claim: For every input array of n numbers, Merge Sort produces a sorted output array and uses at most 6n * log2(n) + 6n operations.
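A minimal Python sketch of the algorithm behind this operation count, assuming the standard top-down formulation from lecture (the names merge_sort and merge are mine, not from the source):

```python
def merge(left, right):
    """Merge two sorted lists into one sorted list in linear time."""
    out = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i])
            i += 1
        else:
            out.append(right[j])
            j += 1
    # One side is exhausted; copy the remainder of the other.
    out.extend(left[i:])
    out.extend(right[j:])
    return out

def merge_sort(a):
    """Sort a: roughly 6n operations per level over about log2(n) levels."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    return merge(merge_sort(a[:mid]), merge_sort(a[mid:]))
```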

Roughly how many levels does a Merge Sort recursion tree have (as a function of n, the length of the input array)?

Roughly log2(n); log2(n) + 1 to be exact (levels 0 through log2(n)).
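A quick sanity check of the level count, assuming n is a power of 2 and counting the input itself as level 0 (names and values are mine):

```python
import math

def levels(n):
    """Count levels by halving until subproblems reach the size-1 base case."""
    count = 1  # level 0: the input array itself
    while n > 1:
        n //= 2
        count += 1
    return count

for n in [2, 8, 1024]:
    assert levels(n) == int(math.log2(n)) + 1
```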

Question 1: 3-way Merge Sort. Suppose that instead of dividing in half at each step of Merge Sort, you divide into thirds, sort each third, and finally combine all of them using a three-way merge subroutine. What is the overall asymptotic running time of this algorithm? (Hint: Note that the merge step can still be implemented in O(n) time.)

Θ(n log n) -- the three-way merge at each level is still O(n), and there are Θ(log n) levels, so the amount of work per level is still linear and the total matches standard Merge Sort.
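A sketch of the three-way variant, using heapq.merge from the standard library as the O(n) three-way merge subroutine (the function name merge_sort_3way and the split points are my choices):

```python
import heapq

def merge_sort_3way(a):
    """Divide into thirds, sort each recursively, then do one three-way merge.
    About log3(n) levels with O(n) work per level gives Theta(n log n) overall."""
    if len(a) <= 1:
        return a
    third = max(1, len(a) // 3)
    parts = [a[:third], a[third:2 * third], a[2 * third:]]
    return list(heapq.merge(*(merge_sort_3way(p) for p in parts if p)))
```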

Omega Notation

T(n) = Ω(f(n)) if and only if there exist constants c, n0 > 0 such that T(n) >= c * f(n) for all n >= n0

Theta Notation

T(n) = Θ(f(n)) if and only if T(n) = O(f(n)) AND T(n) = Ω(f(n))

Big-O (formal def)

T(n) = O(f(n)) if and only if there exist constants c,n0 > 0 such that T(n) <= c * f(n) for all n>= n0
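A small numeric check of this definition with hand-picked witnesses, using T(n) = 3n + 10 and f(n) = n as an illustrative example (the functions and constants are mine, not from the source):

```python
# Witnesses for the claim 3n + 10 = O(n): c = 4, n0 = 10,
# since 3n + 10 <= 4n exactly when n >= 10.
T = lambda n: 3 * n + 10
f = lambda n: n
c, n0 = 4, 10

assert all(T(n) <= c * f(n) for n in range(n0, 10_000))
```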

Question 4: k-way Merge Sort. Suppose you are given k sorted arrays, each with n elements, and you want to combine them into a single array of kn elements. Consider the following approach. Using the merge subroutine taught in lecture, you merge the first 2 arrays, then merge the 3rd given array with this merged version of the first two arrays, then merge the 4th given array with the merged version of the first three arrays, and so on until you merge in the final (kth) input array. What is the running time taken by this successive merging algorithm, as a function of k and n? (Optional: can you think of a faster way to do the k-way merge procedure?)

Θ(n * k^2) -- for the upper bound, the merged list size is always O(kn) and merging is linear, so each of the k merges costs O(kn). For the lower bound, each of the last k/2 merges takes Ω(kn) time.
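A sketch of the successive-merging approach next to the faster heap-based alternative hinted at in the optional part, which runs in Θ(nk log k); it leans on heapq.merge from the standard library, and the function names are mine:

```python
import heapq

def successive_merge(arrays):
    """Merge in the k sorted arrays one at a time with a two-way merge.
    The i-th merge touches Theta(i * n) elements, so the total is Theta(n * k^2)."""
    result = []
    for a in arrays:
        # heapq.merge of two sorted iterables stands in for the linear
        # two-way merge subroutine from lecture.
        result = list(heapq.merge(result, a))
    return result

def heap_kway_merge(arrays):
    """Faster alternative: one k-way merge via a heap, Theta(n * k * log k) total."""
    return list(heapq.merge(*arrays))
```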

Let T(n) = (1/2)n^2 + 3n. Which of the following statements are true? (Check all that apply.)
- T(n) = O(n)
- T(n) = Ω(n)
- T(n) = Θ(n^2)
- T(n) = O(n^3)

- T(n) = Ω(n)
- T(n) = Θ(n^2)
- T(n) = O(n^3)
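One possible set of witness constants for the three true statements, and why O(n) fails (these particular constants are my choice; any valid pair works):

```latex
T(n) = \tfrac{1}{2}n^2 + 3n \;\ge\; 1 \cdot n \ \text{for all } n \ge 1
  \;\Rightarrow\; T(n) = \Omega(n), \\
\tfrac{1}{2}n^2 \;\le\; T(n) \;\le\; 1 \cdot n^2 \ \text{for all } n \ge 6
  \;\Rightarrow\; T(n) = \Theta(n^2), \\
T(n) \;\le\; 1 \cdot n^3 \ \text{for all } n \ge 2
  \;\Rightarrow\; T(n) = O(n^3), \\
T(n)/n = \tfrac{1}{2}n + 3 \to \infty
  \;\Rightarrow\; T(n) \ne O(n).
```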

Question 3: Assume again two (positive) nondecreasing functions f and g such that f(n) = O(g(n)). Is 2^{f(n)} = O(2^{g(n)})? (Multiple answers may be correct; you should check all that apply.)

- Yes, if f(n) ≤ g(n) for all sufficiently large n
- Sometimes yes and sometimes no, depending on f and g
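A standard counterexample for the "sometimes no" case (my choice of f and g):

```latex
f(n) = 2n,\; g(n) = n \;\Rightarrow\; f(n) = O(g(n)), \\
\text{but } 2^{f(n)} = 4^{n} \ne O\!\bigl(2^{g(n)}\bigr) = O(2^{n}),
  \ \text{since } 4^{n}/2^{n} = 2^{n} \to \infty.
```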

Little o notation

Strictly less than: T(n) = o(f(n)) if and only if for every constant c > 0 there exists a constant n0 such that T(n) <= c * f(n) for all n >= n0

What is the pattern? Fill in the blanks in the following statement: at each level j = 0, 1, 2, ..., log2(n) there are _____ subproblems, each of size ______.

2^j and n/(2^j) respectively
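A small check of this pattern, assuming n is a power of 2 (the example value n = 16 is arbitrary):

```python
import math

n = 16  # any power of 2
for j in range(int(math.log2(n)) + 1):  # levels 0 .. log2(n)
    count, size = 2 ** j, n // 2 ** j
    print(f"level {j}: {count} subproblems of size {size}")
    assert count * size == n  # work per level stays linear in n
```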

You are given functions f and g such that f(n) = O(g(n)). Is f(n) * log2(f(n)^c) = O(g(n) * log2(g(n)))? (Here c is some positive constant.) You should assume that f and g are nondecreasing and always bigger than 1.

True -- because the constant c in the exponent is inside a logarithm, it becomes a leading constant and gets suppressed by big O
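A short derivation under the stated assumptions, writing f(n) <= C * g(n) for all n >= n0 (with C >= 1) for the Big-O hypothesis:

```latex
f(n)\,\log_2\!\bigl(f(n)^c\bigr) \;=\; c\, f(n)\,\log_2 f(n)
  \;\le\; c\,C\, g(n)\,\bigl(\log_2 C + \log_2 g(n)\bigr)
  \quad \text{for all } n \ge n_0.
```

Since g is nondecreasing and always bigger than 1, log2 g(n) >= log2 g(1) > 0, so the log2 C term is at most a constant multiple of log2 g(n); the whole expression is therefore bounded by a constant times g(n) * log2 g(n).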

