CS550 03-Asymptotic Complexity

Worst-case running time of Insertion Sort PROVING BIG OMEGA

"At least" n-1 positions, so using arithmetic geometric series the sum is (n-1)*n / n Worst case is AT LEAST this so omega (n^2)

Worst-case running time of Insertion Sort PROVING BIG OH

"at most" >= (upper bound) and inner * outer is n^2 - 2n + 1 is critical

Running time for Insertion Sort

Running time of Insertion Sort is 𝑂(𝑛^2) and Ω(𝑛)

Show that 𝑛^3 − 100𝑛^2 ∉ 𝑂(𝑛^2)

Suppose f(n) ≤ cn^2 for some constant c > 0 and all n ≥ n_0; dividing by n^2 gives n − 100 ≤ c, which fails once n > c + 100. No constant c works, so membership is impossible.

Show that 4𝑛^2 + 100𝑛 + 500 ∈ 𝑂(𝑛^2)

Set f(n) ≤ cn^2; find an appropriate c and n_0
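One workable witness pair (my own choice, not from the slides): bound each lower-order term by n² once n ≥ 1, giving 4n² + 100n + 500 ≤ 4n² + 100n² + 500n² = 604n². A quick numeric sanity check:

```python
# Witness constants for 4n^2 + 100n + 500 in O(n^2): c = 604, n_0 = 1.
c, n0 = 604, 1

def f(n):
    return 4 * n**2 + 100 * n + 500

# Spot-check the inequality f(n) <= c * n^2 over a large range.
assert all(f(n) <= c * n**2 for n in range(n0, 10**4))
print("f(n) <= 604 * n^2 holds for 1 <= n < 10^4")
```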

Show that 4𝑛^2 + 100𝑛 + 500 ∈ Ω(𝑛^2)

Set f(n) ≥ cn^2; find an appropriate c and n_0
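For the lower bound an even simpler witness works (again my own choice): since 100n + 500 ≥ 0 for n ≥ 1, dropping those terms gives 4n² + 100n + 500 ≥ 4n², so c = 4 and n_0 = 1 suffice:

```python
# Witness constants for 4n^2 + 100n + 500 in Omega(n^2): c = 4, n_0 = 1.
c, n0 = 4, 1

def f(n):
    return 4 * n**2 + 100 * n + 500

# Spot-check the inequality f(n) >= c * n^2 over a large range.
assert all(f(n) >= c * n**2 for n in range(n0, 10**4))
print("f(n) >= 4 * n^2 holds for 1 <= n < 10^4")
```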

Θ(𝑔(𝑛)) bound on the worst-case running time ⇏ Θ(𝑔(𝑛)) bound on the running time of every input

Even if you have a tight bound Θ(𝑔(𝑛)) on the worst-case running time, it doesn't necessarily mean that the same bound applies to every possible input. The worst-case and average-case behaviors can differ.

Running time is Ω(𝑔(𝑛)) ⇒ Best case is Ω(𝑔(𝑛))

If the running time of Insertion Sort (over all inputs) has a lower bound of Ω(𝑔(𝑛)), then the best-case running time (the fastest possible scenario) is also bounded from below by Ω(𝑔(𝑛)). The same lower bound carries over to the worst case: the worst-case running time is at least as bad as some function 𝑓(𝑛) in Ω(𝑔(𝑛)), even if you do not have a specific function describing the worst case exactly.

Running time is 𝑂(𝑔(𝑛)) ⇒ Worst-case running time is 𝑂(𝑔(𝑛))

If the running time of Insertion Sort on every input is bounded by 𝑂(𝑔(𝑛)), then in particular the worst-case running time of Insertion Sort is also bounded by 𝑂(𝑔(𝑛)).

𝑂(𝑔(𝑛)) bound on the worst-case running time ⇒ 𝑂(𝑔(𝑛)) bound on the running time of every input

If you have established that the worst-case running time of Insertion Sort is 𝑂(𝑔(𝑛)), then this bound applies to the running time for any input, not just the worst-case input, since no input takes longer than the worst case.

Monotonically increasing

𝑚 ≤ 𝑛 ⟹ 𝑓(𝑚) ≤ 𝑓(𝑛)

lim as n-> ∞ of (lg^a n / n^b)

lim as n → ∞ of (lg^a n / n^b) = 0 for any constants a and b > 0: any polylogarithmic function grows more slowly than any positive polynomial power of n.
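A quick numeric illustration of this limit, with example constants a = 3 and b = 0.5 of my own choosing: even a high log power eventually loses to a small polynomial power.

```python
import math

# Ratio lg^a(n) / n^b for a = 3, b = 0.5: the ratio peaks and then
# decays toward 0 as n grows, illustrating the limit.
a, b = 3, 0.5
for n in (10**2, 10**6, 10**12, 10**18):
    print(n, math.log2(n)**a / n**b)
```

The printed ratios shrink toward 0 as n grows large, even though the log power "wins" for small n.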

log_b(a)

log_c(a) / log_c(b) (change of base; any base c works)
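The change-of-base identity can be verified numerically with the standard library; this is also why log bases only differ by a constant factor asymptotically:

```python
import math

# log_b(a) = log_c(a) / log_c(b) for ANY base c.
# Check with c = e (natural log) against direct computations.
assert math.isclose(math.log(8) / math.log(2), math.log2(8))
assert math.isclose(math.log(100) / math.log(7), math.log(100, 7))
print("change-of-base identities hold")
```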

𝑝 (𝑛) is a polynomial in 𝑛 of degree 𝑑 if

Simplified from slides: the highest power of n with a non-zero coefficient in the polynomial is n^d; that is, p(n) = Σ_{i=0}^{d} a_i n^i with a_d ≠ 0. In other words, the degree is determined by the exponent of the highest-order term. A function 𝑓(𝑛) = 𝑂(𝑛^𝑘) for a constant 𝑘 is called polynomially bounded.

log_b(x^y)

y · log_b(x)

Θ-Notation Definition

Θ(𝑔(𝑛)) is the set of all functions whose rate of growth is the same as that of 𝑔(𝑛). f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n)).

Worst-case running time of Insertion Sort

Θ(𝑛^2), because it is 𝑂(𝑛^2) and Ω(𝑛^2): prove big-Oh and big-Omega to prove big-Theta. We CANNOT say the running time over all inputs is Θ(n²), since the best-case running time is an + b for some inputs; SO the running time of Insertion Sort is 𝑂(𝑛^2) and Ω(𝑛).

Θ-Notation (Big-Theta)

Θ-notation characterizes a tight bound on the asymptotic behavior of a function. A function grows at precisely a certain rate (based on its highest-order terms). A function is Θ(𝑔(𝑛)) if it is both 𝑂(𝑔(𝑛)) and Ω(𝑔(𝑛)).

Ω-Notation Definition

Ω(𝑔(𝑛)) is the set of all functions whose rate of growth is the same as or higher than that of 𝑔(𝑛). Notation: f(n) ∈ Ω(g(n)).

Ω-Notation (Big-Omega)

Ω-notation characterizes a lower bound on the asymptotic behavior of a function. A function grows no slower than a certain rate (based on its highest-order terms). A function is Ω(𝑔(𝑛)) if it grows no slower than 𝑐 ⋅ 𝑔(𝑛) for some constant 𝑐 > 0.

𝑂-Notation Definition

𝑂(𝑔(𝑛)) is the set of all functions whose rate of growth is the same as or lower than that of g(n). We should write 𝑓(𝑛) ∈ 𝑂(𝑔(𝑛)). Note that this definition requires 𝑓(𝑛) and 𝑔(𝑛) to be asymptotically non-negative: 𝑓(𝑛) ≥ 0 as 𝑛 → ∞.

𝑂-Notation (Big-Oh)

𝑂-notation characterizes an upper bound on the asymptotic behavior of a function. A function grows no faster than a certain rate (based on its highest-order terms). A function is 𝑂(𝑔(𝑛)) if it grows no faster than 𝑐 ⋅ 𝑔(𝑛) for some constant 𝑐 > 0.

𝑓(𝑛) = 7𝑛^3 + 100𝑛^2 − 20𝑛 + 6 Big O?

𝑓(𝑛) is 𝑂(𝑛^3). 𝑓(𝑛) is also 𝑂(𝑛^5), 𝑂(2^𝑛), and 𝑂(𝑛!), BECAUSE all these c·g(n) grow at least as fast as f(n) for a constant c.

𝑓(𝑛) = 7𝑛^3 + 100𝑛^2 − 20𝑛 + 6 Big Omega?

𝑓(𝑛) is Ω(𝑛^3). 𝑓(𝑛) is also Ω(𝑛^2), Ω(𝑛), and Ω(log 𝑛). A function f(n) is Ω(𝑔(𝑛)) if it grows no slower than 𝑐 ⋅ 𝑔(𝑛) for some constant 𝑐, and when g(n) is n^2, n, or log n, c·g(n) always grows more slowly than n^3.

Strictly increasing

𝑚 < 𝑛 ⟹ 𝑓(𝑚) < 𝑓(𝑛). Example: true for 𝑓(𝑛) = 𝑛^2 where 𝑛 ≥ 0.

strictly decreasing

𝑚 < 𝑛 ⟹ 𝑓(𝑚) > 𝑓(𝑛)

Monotonically decreasing

𝑚 ≤ 𝑛 ⟹ 𝑓(𝑚) ≥ 𝑓(𝑛)

If 𝑓(𝑛) = 𝑂(𝑔(𝑛)) and 𝑓(𝑛) = Ω(𝑔(𝑛)), then 𝑓(𝑛) = Θ(𝑔(𝑛)):

If big-Omega, then there exist c_1, n'_0 > 0 such that for all n ≥ n'_0, 0 ≤ c_1·g(n) ≤ f(n). If big-O, then there exist c_2, n''_0 > 0 such that for all n ≥ n''_0, f(n) ≤ c_2·g(n). Let n_0 = max{n'_0, n''_0}. So there exist constants c_1, c_2, and n_0 such that for all n ≥ n_0, 0 ≤ c_1·g(n) ≤ f(n) ≤ c_2·g(n). So f(n) = Θ(g(n)).

If 𝑓(𝑛) = Θ(𝑔(𝑛)), then 𝑓(𝑛) = 𝑂(𝑔(𝑛)) and 𝑓(𝑛) = Ω(𝑔(𝑛)):

If f(n) = Θ(g(n)), then there exist constants c_1, c_2, and n_0 such that for all n ≥ n_0, 0 ≤ c_1·g(n) ≤ f(n) ≤ c_2·g(n). So there exist c_1, n_0 > 0 such that for all n ≥ n_0, 0 ≤ c_1·g(n) ≤ f(n); hence f(n) = Ω(g(n)). Also there exist c_2, n_0 > 0 such that for all n ≥ n_0, f(n) ≤ c_2·g(n); hence f(n) = O(g(n)).

Sorting Problem Asymptotic Complexity

Insertion Sort takes Θ(n^2) in the worst case, so sorting (as a problem) is O(n^2). Any sorting algorithm must look at each item, so sorting is Ω(n).

polylogarithmically bounded

f(n) is polylogarithmically bounded if f(n) = O(lg^k n) for some constant k. Log bases do not matter in asymptotic notations, since changing the base only changes a constant factor.

𝜔-Notation

Non-tight lower bound. Intuitively, 𝑓(𝑛) becomes arbitrarily large w.r.t. 𝑔(𝑛) as 𝑛 → ∞: lim_{𝑛→∞} 𝑓(𝑛)/𝑔(𝑛) = ∞

𝑜-Notation

Non-tight upper bound. Intuitively, 𝑓(𝑛) becomes insignificant w.r.t. 𝑔(𝑛) as 𝑛 → ∞: lim_{𝑛→∞} 𝑓(𝑛)/𝑔(𝑛) = 0
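Both limits are easy to see numerically. Taking f(n) = 100n and g(n) = n² as example functions of my own choosing: f is o(g) because the ratio 100/n shrinks to 0, and symmetrically g is ω(f) because the inverse ratio diverges.

```python
# f(n) = 100n is o(n^2): the ratio f(n)/g(n) = 100/n tends to 0.
# Symmetrically, n^2 is omega(100n): g(n)/f(n) = n/100 diverges.
f = lambda n: 100 * n
g = lambda n: n * n
for n in (10**2, 10**4, 10**6):
    print(n, f(n) / g(n))
```

The printed ratio drops by a factor of 100 at each step, heading to 0.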

Exponentials

Functions that have 𝑎^𝑛 terms (𝑎 is a constant) Exponentials: constant base, variable exponent

𝑛^𝑏 = 𝑜(𝑎^𝑛) - why?

Exponentials dominate polynomials: for any real constants 𝑎, 𝑏 with 𝑎 > 1, this is true because lim as n approaches ∞ of n^b / a^n = 0.
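A numeric sketch of this limit, with example constants a = 2 and b = 10 of my own choosing: the ratio is large at first but is eventually crushed by the exponential.

```python
# n^b / a^n for a = 2, b = 10: polynomial over exponential.
# The ratio is huge for small n but decays toward 0 as n grows.
r = lambda n: n**10 / 2**n
for n in (10, 100, 200):
    print(n, r(n))
```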

Best Case Running Time for Insertion Sort

Best-case running time is both 𝑂(𝑛) and Ω(𝑛) ⇒ Θ(𝑛)
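The linear best case can be seen by counting comparisons on an already-sorted input, where the inner while-loop test fails immediately for every key. `count_comparisons` is an illustrative helper of my own, not from the slides:

```python
# Count key comparisons made by insertion sort. On sorted input each of
# the n-1 keys needs exactly one comparison, giving n-1 total (linear);
# on reverse-sorted input the count is n(n-1)/2 (quadratic).

def count_comparisons(a):
    a = list(a)
    comps = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            comps += 1
            if a[j] <= key:   # key is in place: best case stops here
                break
            a[j + 1] = a[j]   # otherwise shift and keep scanning
            j -= 1
        a[j + 1] = key
    return comps

print(count_comparisons(range(100)))          # sorted: 99 comparisons
print(count_comparisons(range(99, -1, -1)))   # reversed: 4950 comparisons
```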

