CS 146


Algorithmic complexity is concerned with ______________________

how fast or slow a particular algorithm performs.

stacks

- container data structure - LIFO - very efficient - probably the right container to use when retrieval order does not matter at all - push: insert at top of stack - pop: return and remove top item
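
A minimal Python sketch of the two stack operations using a built-in list (the pushed values are just placeholders):

```python
# Minimal stack sketch using a Python list (illustrative values only).
stack = []
stack.append("a")   # push: insert at top of stack
stack.append("b")
stack.append("c")
print(stack.pop())  # pop: return and remove the top item -> "c"
print(stack.pop())  # -> "b" (last in, first out)
```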

linear-time and n*lg(n) algorithms remain practical on inputs of _____________

1 billion

3 main advantages of contiguously-allocated arrays

1) constant-time access with an index 2) space efficiency 3) memory locality

data structures can either be classified as _________ or ___________

1) contiguously-allocated structures 2) linked data structures

divide and conquer steps

1) divide: break the problem into subproblems 2) conquer: solve the subproblems recursively 3) combine: merge the solutions to the subproblems into the solution for the original problem

steps in solving recurrence relations by substitution

1) guess the form of the solution 2) use induction to find the constants and show that the solution works - we substitute the guessed solution for the function when applying the inductive hypothesis to smaller values - we must be able to guess the correct answer to apply this method and then prove our guess correct
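
A short LaTeX sketch of the inductive step, assuming the example recurrence T(n) = 2T(n/2) + n and the guess T(n) <= c*n*lg n (a standard illustration, not taken from the card):

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Substitution-method sketch for the assumed recurrence
% T(n) = 2T(n/2) + n with the guessed bound T(n) <= c n lg n.
\begin{align*}
T(n) &\le 2\,c\,\tfrac{n}{2}\lg\tfrac{n}{2} + n \\
     &= c\,n\lg n - c\,n + n \\
     &\le c\,n\lg n \quad \text{whenever } c \ge 1.
\end{align*}
\end{document}
```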

two common problems when defining the problem an algorithm is supposed to solve

1) ill-defined questions 2) creating compound goals. Here is an example of a well-defined problem with a compound goal, which is bad: find the shortest path from a to b that does not use more than twice as many turns as necessary

relative advantages of arrays over linked lists include:

1) less space, since pointers are not required 2) arrays allow random access to items, which linked lists do not 3) arrays allow better memory locality and cache performance than random pointer jumping

relative advantages of linked lists over static arrays include:

1) overflow cannot occur 2) insertions and deletions are simpler 3) with large records, moving pointers is easier and faster than moving the items themselves

two most useful techniques that allow us to compare the efficiency of algorithms

1) the RAM model of computation 2) asymptotic analysis of worst-case complexity

algorithms take roughly the same time for n = ?

10

quadratic-time algorithms whose running time is n^2 remain usable up to about n = ________________

10,000, but quickly deteriorate with larger inputs

when resizing a dynamic array each of the n elements moves only _______ times on average

2, so the total work spent managing the dynamic array is only O(n)
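
A rough Python sketch (the class name and copy counter are my own, for illustration) of a doubling dynamic array that counts the element copies caused by resizing; the copies-per-append ratio stays a small constant, which is what makes appends O(1) amortized:

```python
# Doubling dynamic array that tracks how many element moves resizing costs.
class DynArray:
    def __init__(self):
        self.capacity = 1
        self.size = 0
        self.data = [None]
        self.copies = 0                 # total element moves caused by resizing

    def append(self, x):
        if self.size == self.capacity:          # full: double the capacity
            self.capacity *= 2
            new_data = [None] * self.capacity
            for i in range(self.size):          # copy every existing element
                new_data[i] = self.data[i]
            self.copies += self.size
            self.data = new_data
        self.data[self.size] = x
        self.size += 1

arr = DynArray()
n = 1_000_000
for i in range(n):
    arr.append(i)
print(arr.copies / n)   # about 1.05 here; stays below 2, so appends are O(1) amortized
```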

number of different sequences of bits with length of w

2^w

heap

A heap is a binary tree (in which each node contains a Comparable key value), with two special properties: 1) order: the value in each node n is greater than or equal to the values in its children (and thus is also greater than or equal to all of the values in its subtrees) 2) shape: All leaves are either at depth d or d-1 (for some value d). All of the leaves at depth d-1 are to the right of the leaves at depth d. (a) There is at most 1 node with just 1 child. (b) That child is the left child of its parent, and (c) it is the rightmost leaf at depth d.
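
A small Python sketch of a max-heap kept in a list; storing the tree in an array automatically preserves the shape property (the tree stays complete), and the sift-up loop restores the order property after each insert (the function and variable names are my own):

```python
# Max-heap stored in a Python list: children of index i live at 2i+1 and 2i+2.
def heap_insert(heap, value):
    heap.append(value)            # add at the next leaf position (keeps the shape property)
    i = len(heap) - 1
    while i > 0:                  # sift up to restore the order property
        parent = (i - 1) // 2
        if heap[parent] >= heap[i]:
            break
        heap[parent], heap[i] = heap[i], heap[parent]
        i = parent

h = []
for v in [5, 17, 3, 26, 11]:
    heap_insert(h, v)
print(h[0])   # 26: the root holds the maximum value
```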

priority queue implementation

A priority queue can be implemented using many of the data structures that we've already studied (an array, a linked list, or a binary search tree). However, those data structures do not provide the most efficient operations. To make all of the operations very efficient, we'll use a new data structure called a heap.

priority queue

A priority queue is different from a "normal" queue, because instead of being a "first-in-first-out" data structure, values come out in order by priority.
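
A brief usage sketch with Python's standard heapq module (a min-heap), showing that values come out by priority rather than insertion order; the priorities and task strings are invented examples:

```python
import heapq

# Values come out in priority order (smallest priority first), not insertion order.
pq = []
heapq.heappush(pq, (2, "write report"))
heapq.heappush(pq, (1, "fix outage"))
heapq.heappush(pq, (3, "read email"))
while pq:
    priority, task = heapq.heappop(pq)
    print(priority, task)   # 1 fix outage, 2 write report, 3 read email
```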

Divide and Conquer

A program design strategy in which tasks are broken down into subtasks, which are broken down into sub-subtasks, and so on, until each piece is small enough to code comfortably. These pieces work together to accomplish the total job.

what does this mean: T(n) = 2T(n/2) + O(n)

For any n, the time T(n) needed to sort n elements can be computed by taking the time T(n/2) needed to sort n/2 elements, multiplying that time by 2, and adding something extra. That something extra must be at most linear in n.
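
The card describes sorting, which matches mergesort; here is a hedged mergesort sketch in Python with the pieces of the recurrence marked in comments:

```python
# Mergesort matches T(n) = 2T(n/2) + O(n): two recursive calls on halves
# of the input, plus a linear-time merge.
def merge_sort(a):
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])       # T(n/2)
    right = merge_sort(a[mid:])      # T(n/2)
    merged = []                      # O(n) merge of the two sorted halves
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 7]))   # [1, 2, 5, 7, 9]
```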

recurrence relation

An equation that is defined in terms of itself. Any polynomial or exponential can be represented by a recurrence.

queues

FIFO - the fairest way to control waiting times for services - minimizes the maximum waiting time - enqueue: insert at back - dequeue: remove from front
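
A minimal Python sketch of the two queue operations using collections.deque (the values are placeholders):

```python
from collections import deque

# FIFO queue sketch.
q = deque()
q.append("first")     # enqueue: insert at back
q.append("second")
q.append("third")
print(q.popleft())    # dequeue: remove from front -> "first"
print(q.popleft())    # -> "second" (first in, first out)
```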

Space Complexity

How much memory an algorithm needs.

benefits of implementing dictionaries with a sorted doubly-linked list

O(1) for all operations except search and insert; search still takes O(n), the same as in the other list implementations - consult pg 75 of the Algorithm Design Manual for more information

O(g(n)) * O(f(n))

O(g(n) * f(n)) - the same rule holds for big-Theta and big-Omega

an ____________ algorithm hardly breaks a sweat for any value of n

O(lg n)

runtime of binary search

O(lg n)
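
A standard iterative binary search sketch in Python; each iteration halves the remaining range, which is where the O(lg n) bound comes from (the function name and test list are my own):

```python
# Binary search halves the candidate range each step, hence O(lg n).
def binary_search(a, target):       # a must be sorted
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if a[mid] == target:
            return mid
        elif a[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1                        # not found

print(binary_search([1, 3, 5, 7, 9, 11], 7))   # 3
```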

(T or F) asymptotically tight bound says nothing about the possibility of having a faster algorithm in general.

T

(T or F) the base of a logarithm has no real impact on the growth rate

T

what is the initial or boundary condition in a recurrence relation?

The initial or boundary condition gives the value of the recurrence on its smallest input(s) and terminates the recursion

pointers

are the connections that hold the pieces of linked structures together - a pointer represents the address of a location in memory

(True or False) ceilings, floors, and boundary conditions do not USUALLY matter when solving recurrence relations

True

asymptotic notation

When we drop the constant coefficients and the less significant terms, we use asymptotic notation

f(n) = o(g(n)) is like:

a < b

f(n) = O(g(n)) is like:

a <= b

f(n) = Θ(g(n)) is like:

a = b

f(n) = ω(g(n)) is like:

a > b

f(n) = Ω(g(n)) is like:

a >= b

big-theta

a function f(n) belongs to the set Θ(g(n)) if there exist positive constants c and d such that f(n) can be sandwiched between c*g(n) and d*g(n) for all sufficiently large n

when do cubic (n^3) generally occur?

enumerating all triples of items in an n-element universe, and also in some dynamic programming algorithms
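
A toy Python sketch of the triple-enumeration pattern; three nested loops over an n-element list visit Θ(n^3) triples (the item list is an arbitrary example):

```python
# Enumerating all triples of items from an n-element universe is Theta(n^3).
items = ["a", "b", "c", "d"]
count = 0
for i in range(len(items)):
    for j in range(i + 1, len(items)):
        for k in range(j + 1, len(items)):
            count += 1        # visit the triple (items[i], items[j], items[k])
print(count)                  # C(4, 3) = 4 triples
```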

dictionaries

abstract data types that retrieve items based on key values or content

What does big O notation represent?

an asymptotic upper bound on a function, and therefore (for running time) a bound on the worst-case scenario

linked data structures

composed of distinct chunks of memory bound together by pointers - examples include lists, trees, and graph adjacency lists

contiguously-allocated structures

composed of single slabs of memory (arrays, matrices, heaps, and hash tables)

dynamic arrays

arrays that can change size

big- Omega

asymptotic lower bound

RAM (Random Access Machine) model for comparing algorithm efficiency

each simple operation takes exactly one step, and loops are not considered simple operations - each memory access also takes exactly one step, and we have as much memory as we need - the RAM model makes no distinction between data stored in cache and data stored on disk - every model has a size range over which it is useful

why do we estimate efficiency of each algorithm asymptotically ?

because the same algorithm runs at different speeds on different machines

big-Θ notation is used to give ________________________

both upper and lower bounds on a function.

formula for the number of leaves in a tree of height h in which every internal node has d children

d^h

how to use a recursion tree to generate guesses

each node represents the cost of a single subproblem - we sum the costs within each level of the tree to obtain a set of per-level costs, and then we sum all the per-level costs to determine the total cost of all levels of the recursion - the recursion must eventually reach the boundary condition(s)
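
A LaTeX sketch of the per-level sums for the assumed example T(n) = 2T(n/2) + cn (a standard illustration, not taken from the card):

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Recursion-tree sketch for the assumed example T(n) = 2T(n/2) + cn:
% level i has 2^i subproblems of size n/2^i, each contributing cn/2^i.
\begin{align*}
\text{cost of level } i &= 2^i \cdot c\,\frac{n}{2^i} = c\,n,\\
\text{number of levels} &\approx \lg n + 1,\\
T(n) &\approx \sum_{i=0}^{\lg n} c\,n = c\,n(\lg n + 1) = \Theta(n \lg n).
\end{align*}
\end{document}
```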

when do n! algorithms generally arise?

generating all permutations or orderings of n items
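
A short Python sketch using itertools.permutations to make the n! blow-up concrete (the item list is an arbitrary example):

```python
from itertools import permutations

# Generating all orderings of n items produces n! outputs, so any algorithm
# that enumerates them needs at least n! time.
items = ["a", "b", "c"]
for p in permutations(items):
    print(p)        # 3! = 6 orderings
```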

disadvantage of using dynamic arrays over regular arrays

we lose the guarantee that each array access or insertion takes constant time in the worst case; with a dynamic array that bound holds only in the amortized sense

main difference between little-o and Big-o

in f(n) = O(g(n)), the bound 0 <= f(n) <= c*g(n) holds for some constant c > 0, but in f(n) = o(g(n)), the bound 0 <= f(n) < c*g(n) holds for all constants c > 0

dictionaries implemented by unsorted vs sorted arrays

insert and delete are faster for unsorted; successor, predecessor, and max/min are faster for sorted

recurrence relation

is a way of recursively defining a function; for example, the recurrence relation T(n) = 4T(n / 3) + O(1) defines T(n) in terms of T on a smaller input

motivation for amortized analysis

is that looking at the worst-case run time per operation, rather than per algorithm, can be too pessimistic

analysis of algorithms

is the determination of the computational complexity of algorithms, that is the amount of time, storage and/or other resources necessary to execute them. Usually, this involves determining a function that relates the length of an algorithm's input to the number of steps it takes (its time complexity) or the number of storage locations it uses (its space complexity).

amortized runtime complexity

is the function defined by a sequence of operations applied to an input of size n and averaged over that sequence - amortized analysis considers both the costly and less costly operations together over the whole series of operations of the algorithm

ω-notation

a lower bound that is not asymptotically tight (little-omega): 0 <= c*g(n) < f(n) for all constants c > 0 and all sufficiently large n

master theorem for divide-and-conquer recurrences

for recurrences of the form T(n) = a*T(n/b) + f(n), with a >= 1 and b > 1, compare f(n) to n^(log_b a): 1) if f(n) = O(n^(log_b a - ε)) for some ε > 0, then T(n) = Θ(n^(log_b a)) 2) if f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) * lg n) 3) if f(n) = Ω(n^(log_b a + ε)) for some ε > 0 and a*f(n/b) <= c*f(n) for some c < 1 and all large n, then T(n) = Θ(f(n)). For example, mergesort's T(n) = 2T(n/2) + Θ(n) falls into case 2, giving Θ(n lg n).

Asymptotic

means approaching a value or curve arbitrarily closely

We define time complexity as a _______________________

numerical function T(n) - time versus the input size n

formula for the number of leaves in a perfect binary tree of height h

2^h

any algorithm with n! running time becomes useless for _________

n >= 20

algorithms whose running time is 2^n become impractical for __________

n >= 40

do stacks or queues provide better average wait times

neither; the average waiting time is the same (a queue only minimizes the maximum waiting time)

do floors and ceilings matter in recursion trees or recurrence relations?

usually not, so we can typically ignore them

containers are distinguished by ______

the particular retrieval order they support

dictionaries

permits access to data items by content - search, insert, and delete - sometimes: max/min, predecessor, successor

"container" data structures

permits storage and retrieval of data items independent of content

O(g(n)) is defined as _____________

the set of functions that grow no faster than g(n)

ways to implement a dictionary

a sorted or unsorted array, and a sorted or unsorted singly- or doubly-linked list

compare sorted vs unsorted data structures

sorted: search operations are fast but maintenance (insert and delete) is slow; unsorted: the opposite

We will represent the time function T(n) using __________________

the "big-O" notation to express an algorithm runtime complexity. For example, the following statement T(n) = O(n2) says that an algorithm has a quadratic time complexity.

We will measure time T(n) as __________________________

the number of elementary "steps" (defined in any way), provided each such step takes constant time

iterated log

the number of times the logarithm function must be iteratively applied before the result is less than or equal to 1
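
A small Python sketch of the iterated logarithm exactly as defined on this card (the function name log_star is my own):

```python
import math

# Iterated logarithm: count how many times lg must be applied
# before the value drops to <= 1.
def log_star(n):
    count = 0
    while n > 1:
        n = math.log2(n)
        count += 1
    return count

print(log_star(65536))    # 4: 65536 -> 16 -> 4 -> 2 -> 1
```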

smaller upper bounds are _______

stronger statements

3 popular methods to solve recurrence relations

substitution, recursion-tree, and master method

big-omega and big-o relation to big theta

big-O provides the upper bound and big-Omega the lower bound; together they create the "sandwiched" region that defines big-Theta

Big-O notation is used to give ___________________

upper bounds on a function

little-o

used to denote an upper bound that is not asymptotically tight

disadvantage of contiguously-allocated arrays

we cannot adjust their size in the middle of a program's execution. We can compensate by allocating more room in the array than necessary, but this wastes space. This is why we have dynamic arrays.

purpose of a recursion tree

we use it to generate good guesses for solutions to recurrence relations, which can then be verified with the substitution method

when do exponential (c^n) run-time algorithms arise?

when enumerating all subsets of n items
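
A brief Python sketch that enumerates every subset of a small set via bitmasks, which makes the 2^n growth explicit (the items are an arbitrary example):

```python
# Enumerating all subsets of n items takes Theta(2^n) time:
# each item is either in or out of a subset, giving 2^n combinations.
items = ["a", "b", "c"]
n = len(items)
for mask in range(2 ** n):                      # one bitmask per subset
    subset = [items[i] for i in range(n) if mask & (1 << i)]
    print(subset)
# prints 2^3 = 8 subsets, from [] to ['a', 'b', 'c']
```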

O(x^2) means......

O(x^2) means that it's no more than some constant times x^2 for all large enough x; "asymptotically tight" means it really is some constant times x^2 for large enough x and not, say, some constant times x^1.999

