Test 1--Algorithms


How can you analyze a graph to visualize growth orders?

As you move to the right along the x-axis, a faster-growing function eventually becomes larger

What is asymptotic analysis?

Comparison of running times of functions in the limit (asymptotically); a measure of how fast each function grows as n goes to infinity (big-O, essentially)

Logarithmic Efficiency

Cuts problem size by constant fraction on each iteration

Karatsuba's multiplication algorithm

Formulated the first integer multiplication algorithm to break the O(n^2) barrier; Based on 2 key ideas: divide and conquer strategy and math trick due to Gauss
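
Gauss' trick (three multiplications instead of four) is what makes the divide-and-conquer recursion beat O(n^2). A minimal Python sketch (function name is mine):

```python
def karatsuba(x, y):
    """Multiply non-negative integers x and y with Karatsuba's divide-and-conquer scheme."""
    if x < 10 or y < 10:              # base case: a single-digit factor
        return x * y
    half = max(len(str(x)), len(str(y))) // 2
    p = 10 ** half
    a, b = divmod(x, p)               # x = a*10^half + b
    c, d = divmod(y, p)               # y = c*10^half + d
    ac = karatsuba(a, c)
    bd = karatsuba(b, d)
    # Gauss' trick: ad + bc = (a+b)(c+d) - ac - bd, costing one multiply, not two
    ad_plus_bc = karatsuba(a + b, c + d) - ac - bd
    return ac * 10 ** (2 * half) + ad_plus_bc * p + bd
```

Three recursive multiplications of half-size numbers give T(n) = 3T(n/2) + O(n), i.e., O(n^log2(3)) ≈ O(n^1.585).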

What is prime factorization?

Integer factorization where the factors are prime numbers

Worst-case complexity

Most commonly used, an upper bound on running time for any possible input of size n

What is the first abstraction of complexity analysis?

Operation count: identify key operation of algorithm; count number of times operation is performed

What are properties of logarithms?

Product rule, quotient rule, power rule

What is the greatest common divisor?

The largest positive integer that divides two integers without a remainder
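
Euclid's algorithm computes the GCD efficiently; a minimal Python sketch (function name is mine):

```python
def gcd(a, b):
    """Euclid's algorithm: gcd(a, b) = gcd(b, a mod b), repeated until the remainder is 0."""
    while b != 0:
        a, b = b, a % b
    return a
```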

When the running time T(n) of an algorithm depends on other characteristics of the input besides its size n, we distinguish between what kinds of analysis?

Worst-case complexity, best-case complexity, average-case complexity

Worst-case time complexity of a computational problem

is the worst-case complexity of the best algorithm that can solve the problem

Towers of Hanoi recursive complexity

2^n - 1 = O(2^n)

Napier's Bones

A mechanical device for doing multiplication and division based on the lattice method; invented by John Napier around 1600

What other subject did al-Khwarizmi influence?

Algebra

Factorial Efficiency

Algorithm generates all permutations of n-element set

Exponential Efficiency

Algorithm generates all subsets of n-element set

Constant Efficiency

Algorithm ignores its input (i.e., doesn't even scan the input)

Linear Efficiency

Algorithm scans its input (at least)

What is a search algorithm?

An algorithm that searches for an item in an array, or linked list, or more complex data structure ( binary tree, AVL tree, etc.)

What is Big-Omega?

Asymptotic lower bound;

Insertion Sort Analysis

Best case: O(n), worst case: O(n^2), average case O(n^2)

What is the essential feature of an algorithm?

Can be executed automatically, in a machine-like way

Sequential search

Check every element of a list sequentially until a match is found; best case: bigTheta(1), worst case: bigTheta(n)
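
A minimal Python sketch of sequential search (function name is mine):

```python
def sequential_search(items, target):
    """Return the index of target in items, or -1 if absent.
    Best case Theta(1) (first element), worst case Theta(n) (last or absent)."""
    for i, item in enumerate(items):
        if item == target:
            return i
    return -1
```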

Master Theorem

Cookbook approach for solving recurrences of the form T(n) = aT(n/b) + bigTheta(n^d), where a >= 1, b > 1, and d >= 0 are constants
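
In this form the three cases compare a against b^d (equivalently, d against log_b(a)); written out:

```latex
T(n) = aT(n/b) + \Theta(n^d)
\;\implies\;
T(n) =
\begin{cases}
\Theta(n^d) & \text{if } a < b^d,\\
\Theta(n^d \log n) & \text{if } a = b^d,\\
\Theta(n^{\log_b a}) & \text{if } a > b^d.
\end{cases}
```

For merge sort, a = 2, b = 2, d = 1, so a = b^d and T(n) = Theta(n log n).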

What is integer factorization?

Decomposition of a composite number into a product of smaller integers; believed to be computationally hard (unlike computing the GCD, which is easy)
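
A minimal trial-division sketch in Python (function name is mine; practical only for small n, which is exactly why factorization is considered hard):

```python
def prime_factors(n):
    """Decompose n > 1 into its prime factors by trial division."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:         # divide out every copy of d
            factors.append(d)
            n //= d
        d += 1
    if n > 1:                     # whatever remains is prime
        factors.append(n)
    return factors
```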

Exponential Algorithm

Ex: Towers of Hanoi; exponential algorithms are not practical. At a rate of one move per second, moving 32 disks would take about 2^32 seconds ≈ 136 years

Average-case complexity

Expected performance averaged over all possible inputs of size n; very useful but much more difficult analysis

What are two ways to analyze algorithm scalability?

Experimentation: implement algorithm and run it with different input sizes; complexity: abstract form of analysis that doesn't require implementation

What is the inverse of logarithmic growth?

Exponential. Logarithmic growth is extremely slow; exponential growth is extremely fast

Selection Sort

Find the smallest element in the array, exchange it with the element in the first position, find the second smallest element and exchange it with the element in the second position, continue until the array is sorted
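
The steps above in a minimal Python sketch (function name is mine):

```python
def selection_sort(a):
    """Sort list a in place: repeatedly select the minimum of the unsorted suffix."""
    for i in range(len(a) - 1):
        min_idx = i
        for j in range(i + 1, len(a)):
            if a[j] < a[min_idx]:
                min_idx = j
        a[i], a[min_idx] = a[min_idx], a[i]   # exchange it into position i
    return a
```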

Other than computers, what else have algorithms been run on?

Humans! Humans did computing back in the day

What is a slide rule?

A calculating device based on Napier's logarithms (the slide rule itself is usually credited to William Oughtred); used for rapid multiplication and division

Why study sorting?

It is the best-studied problem in computer science, with a variety of different algorithms; most of the interesting ideas we will encounter in the course can be taught in the context of sorting, such as divide-and-conquer, randomized algorithms, and lower bounds

What reduced multiplication and division times?

Logarithms: they turn multiplication and division into addition and subtraction of logs (e.g., via log tables or a slide rule)

Quadratic Efficiency

Loop inside loop = "nested loop"

Cubic Efficiency

Loop inside nested loop

Best-case complexity

Lower bound on running time for any possible input of size n; usually not very informative and rarely used

What have algorithms been used for in the past thousands of years?

Mathematics: addition, subtraction, multiplication, division, square roots, equations, prime numbers, pi

Who is Alan Turing?

Gave the first formal definition of 'algorithm' by creating the Turing machine; portrayed by Benedict Cumberbatch in The Imitation Game

What is Moore's Law?

Predicts the number of transistors that fit on a computer chip will double every one and a half to two years;

n-Log-n Efficiency

Typical of divide-and-conquer algorithms (e.g., merge sort)

Where did the word computer come from?

Originally a job title: a 'computer' was a person who computed. The name 'electronic computer' distinguished the machine (which was enormous at first) from human computers

Egyptian Multiplication

Start two columns: 1 on the left, the multiplicand on the right. Keep doubling both until the left column would exceed the multiplier. Working from the largest left-column value down, subtract left-column values from the multiplier until you reach 0, starring each value used. Add the right-column numbers in the starred rows. Complexity O(n^2)
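
The doubling table is equivalent to decomposing the multiplier into powers of two; a minimal Python sketch of that equivalent form (function name is mine):

```python
def egyptian_multiply(m, n):
    """Multiply m * n by repeated doubling: sum the doublings of n whose
    matching powers of two are 'starred' in the binary decomposition of m."""
    total = 0
    power, double = 1, n
    while power <= m:
        if m & power:             # this power of two appears in m
            total += double
        power <<= 1               # advance the left column (1, 2, 4, ...)
        double <<= 1              # advance the right column (n, 2n, 4n, ...)
    return total
```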

Towers of Hanoi

There are n disks labeled 1, 2, 3, ..., n and three towers labeled A, B, and C. No disk may sit on top of a smaller disk at any time; all disks start on tower A; only one disk can be moved at a time, and it must be the top disk on its tower
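
The standard recursive solution, which makes 2^n - 1 moves, as a minimal Python sketch (function name is mine):

```python
def hanoi(n, source, target, spare, moves=None):
    """Move n disks from source to target using spare; returns the move list."""
    if moves is None:
        moves = []
    if n > 0:
        hanoi(n - 1, source, spare, target, moves)   # clear the top n-1 disks
        moves.append((source, target))               # move the largest disk
        hanoi(n - 1, spare, target, source, moves)   # stack the n-1 disks back on top
    return moves
```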

Interpolation Search

Uses knowledge about the distribution of values in an array to improve the efficiency of the search; Average case O(log(log(n))), worst case O(n)

Recursive Algorithm

a form of decomposition where a problem is solved by decomposing it into one or more simpler problems that have the same form as the original

Recursive Function

a function which calls itself somewhere in the function body

Tree Recursion

a recursive function calls itself two or more times before returning a value
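
The classic example is naive Fibonacci, where each call spawns two more; a minimal Python sketch:

```python
def fib(n):
    """Tree recursion: two recursive calls per invocation, giving exponential growth."""
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)
```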

Recursion

a technique for defining data structure or algorithms in terms of themselves

What is the average case for quicksort?

bigTheta(nlgn)

Quicksort

divide-and-conquer algorithm; partition the array into two subarrays. Sort the two subarrays by recursive calls to quick sort. The subarrays are sorted in place. No extra work is needed to combine them
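
A minimal in-place Python sketch, using the Lomuto partition scheme with the last element as pivot (function names and pivot choice are mine; other partition schemes exist):

```python
def quicksort(a, lo=0, hi=None):
    """In-place quicksort: partition around a pivot, then recurse on each side."""
    if hi is None:
        hi = len(a) - 1
    if lo < hi:
        p = partition(a, lo, hi)
        quicksort(a, lo, p - 1)
        quicksort(a, p + 1, hi)
    return a

def partition(a, lo, hi):
    """Lomuto partition: place pivot a[hi] at its final index and return it."""
    pivot = a[hi]
    i = lo
    for j in range(lo, hi):
        if a[j] <= pivot:         # move smaller-or-equal elements to the left side
            a[i], a[j] = a[j], a[i]
            i += 1
    a[i], a[hi] = a[hi], a[i]     # pivot goes between the two subarrays
    return i
```

No merge step is needed: once each side is sorted in place, the whole array is sorted.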

What is an algorithm?

A finite set of precise instructions for performing a computation or solving a problem

Divide and conquer

important strategy for algorithm design that is typically implemented using tree recursion

Worst-case time complexity of an algorithm

its running time on the worst-case input instance

What are 'keys' for asymptotic notation?

little-o is <; little-omega is >; Theta is =; big-O is <=; big-Omega is >=

State the order of growth rates from slowest to fastest.

log(n) --> n --> n*log(n) --> n^2 --> n^3 --> 2^n --> n!

Sorting

sort a sequence of n elements in non-decreasing order

Linear Recursion

the simplest form of recursion, where the recursive function makes at most one recursive call each time it is invoked

What does the running time of quicksort depend on?

whether the partition is balanced or not

What are the 3 steps for divide and conquer?

1) Divide the problem into two or more smaller instances of the same problem, ideally of about the same size; 2) conquer the sub problems by solving them recursively. If they are small enough, just solve them in a straightforward manner; 3) Combine the solutions to create a solution to the original problem

Kolmogorov

A famous Russian mathematician and one of the founders of algorithmic complexity theory; conjectured in 1952 that grade-school multiplication is asymptotically optimal, meaning any multiplication algorithm must perform bigTheta(n^2) elementary operations

Tail Recursion

An algorithm uses tail recursion if it both uses linear recursion and makes a recursive call as its very last operation
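
A minimal Python sketch of a tail-recursive factorial (function name is mine; note CPython does not actually optimize tail calls):

```python
def factorial_tail(n, acc=1):
    """Tail recursion: the recursive call is the very last operation,
    with the partial result carried in the accumulator acc."""
    if n <= 1:
        return acc
    return factorial_tail(n - 1, acc * n)
```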

What is the second abstraction of complexity analysis?

Analysis of the running time for input size n for large n; ignores lower-order terms and constant factors; written with asymptotic notation (big-O, big-Omega, big-Theta)

What is Big-Theta?

Asymptotic tight bound;

What is Big-O notation?

Asymptotic upper bound; cannot compare algorithms within the same complexity class; gives sensible comparisons for large n across different complexity classes

Selection Sort Analysis

Best case, worst case, avg case are all O(n^2)

Open Addressing

Collision resolution method where all elements are stored in the hash table itself; when collisions occur, use a systematic procedure to store elements in free slots on the table

Chaining

Collision resolution method where you store all elements that hash to the same slot in a linked list; store a pointer to the head of the linked list in the hash table slot
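
A minimal Python sketch of a chained hash table (class and method names are mine; real implementations also track load factor and rehash):

```python
class ChainedHashTable:
    """Hash table with chaining: each slot holds a list of (key, value) pairs."""

    def __init__(self, size=11):
        self.slots = [[] for _ in range(size)]

    def _chain(self, key):
        """The chain for key's slot, chosen by hash(key) mod table size."""
        return self.slots[hash(key) % len(self.slots)]

    def put(self, key, value):
        chain = self._chain(key)
        for i, (k, _) in enumerate(chain):
            if k == key:                  # key already present: update in place
                chain[i] = (key, value)
                return
        chain.append((key, value))        # new key (or collision): append to the chain

    def get(self, key):
        for k, v in self._chain(key):
            if k == key:
                return v
        raise KeyError(key)
```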

Binary Search

First look at the middle element of the array. If not equal to the target element, the target is either in the left sub-array or the right sub-array. Divide the array in half and repeat the search in the half that contains the target; Worst case O(log(n)), average case O(log(n)), best case O(1)
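
The steps above as an iterative Python sketch over a sorted list (function name is mine):

```python
def binary_search(a, target):
    """Binary search on sorted list a; returns an index of target or -1."""
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if a[mid] == target:
            return mid
        elif a[mid] < target:
            lo = mid + 1          # target can only be in the right half
        else:
            hi = mid - 1          # target can only be in the left half
    return -1
```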

What other names is the greatest common divisor known by?

Greatest common factor, highest common factor, greatest common measure, highest common divisor

Substitution method

Guess the form of the solution, then use mathematical induction to show it is correct; recursion trees can help generate the guess

Why are the bases of logarithms ignored in asymptotic notation?

If the base of a log is changed from one constant to another, the value changes by only a constant factor

Merge Sort

Invented by John von Neumann in 1945; it is a divide-and-conquer algorithm. Divide the n-element sequence to be sorted into two subsequences of n/2 elements each, sort the two subsequences recursively using merge sort, then merge the two sorted subsequences to produce the sorted answer
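
The divide/conquer/combine steps as a minimal Python sketch (function names are mine):

```python
def merge_sort(a):
    """Divide into halves, sort each recursively, then merge: Theta(n log n)."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    return merge(merge_sort(a[:mid]), merge_sort(a[mid:]))

def merge(left, right):
    """Merge two sorted lists into one sorted list in linear time."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i])
            i += 1
        else:
            out.append(right[j])
            j += 1
    return out + left[i:] + right[j:]   # append whichever side has leftovers
```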

Grade-school "long" multiplication

Involves the multiplications of an n-digit number by a single digit, plus the addition of n numbers which have at most 2n digits. Complexity O(n^2)

Partitioning

Key step of quicksort algorithm; given the selected pivot, partition the remaining elements into two smaller sets

Insertion Sort

Like sorting a hand of playing cards; start with an empty left hand and the cards face down on the table, remove one card at a time from the table and insert it into the correct position in the left hand, compare it with each of the cards already in the hand from right to left, the cards held in the left hand are sorted
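
The card-sorting analogy as a minimal Python sketch (function name is mine):

```python
def insertion_sort(a):
    """Insert each element into its place among the already-sorted prefix,
    shifting larger elements right, like inserting a card into a sorted hand."""
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:   # compare right to left, shifting as we go
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a
```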

What is the origin of the word 'algorithm'?

Muhammad ibn Musa al-Khwarizmi wrote a book about mathematical processes; centuries later, Europeans called such processes 'algorithms' after the Latin form of his name, 'Algoritmi'

Merge Sort Analysis

Running time T(n) of Merge sort: Divide: computing the middle takes bigTheta(1), Conquer: solving 2 subproblems takes 2T(n/2), combine: merging n elements takes bigTheta(n)....Total: T(1) = bigTheta(1) and T(n) = 2T(n/2)+bigTheta(n)

Who is John Napier?

Scottish mathematician, physicist, astronomer, and theologian; invented logarithms and popularized the decimal point; believed the Pope was the Antichrist and predicted the world would end in 1688 or 1700

Lattice Method of Multiplication

Set up the grid, fill in products, add diagonally right to left and carry as necessary to the next diagonal. Complexity of O(n^2)

How are recurrence relations solved?

Substitution method or master theorem

What is the general form of a recurrence relation?

T(n) = bigTheta(1) if n<=c, T(n) = aT(n/b)+f(n), where a is the number of recursive calls, n/b is the size of the input for a recursive call, and f(n) is the extra work for dividing a problem before the recursive calls and then combining the solutions to the subproblems after return from the recursion

Hashing

Technique where the average-case access time can be O(1). Basic idea: items are stored in an array of size N; the preferred position in the array is computed by a hash function of the item's key; when adding an item, if the preferred position is occupied, a collision resolution method is used to find another position to store it. Search costs bigTheta(n) time in the worst case, but all operations can be made to have an expected complexity of bigTheta(1)

What are the two parts of a recursive definition?

The base case: a stopping condition; the recursive step: an expression of the computation or definition in terms of itself

Why are the bases of exponentials NOT ignored in asymptotic notation?

The bases differ by an exponential factor, not a constant factor

What is a logarithm?

The common (base-10) logarithm of a number is the power to which 10 must be raised to give the number; defined for ALL positive numbers, not for negative numbers;

When do logarithms occur?

The data set keeps getting divided by 2

What is a turing machine?

Theoretical generalized computer; finite number of states (like the brain); sensor could read/write/erase symbols onto tape

How are logarithms used in computer science?

We use base 2 because we always work with binary; log2(n) tells us how many bits we need to represent n possibilities

Rehashing

When the load factor exceeds a threshold, double the table size (smallest prime > 2 * old table size); rehash each record in the old table into the new table

Grade-school addition in binary

You can add two binary numbers one column at a time starting from the right, just as you add two decimal numbers; complexity is O(n). Addition is much easier than multiplication in terms of algorithmic complexity

Recurrence Relations

arise when analyzing the running time of recursive algorithms, especially divide-and-conquer algorithms

Why do we need the recursive step in recursive functions?

execution must "drive" computation to a base case so the recursion will stop. The recursive step is intended to ensure that the computation eventually terminates

How is the greatest common divisor notated for two integers a and b?

gcd(a,b) or (a,b)

When can you not use the master theorem?

If T(n) is not monotone, if f(n) is not a polynomial, or if b cannot be expressed as a constant

Gauss' Optimization

Input: a, b, c, d; output: ac, bd, and ad+bc using only three multiplications, since ad + bc = (a+b)(c+d) - ac - bd

What is the relationship of logs to exponentials?

log_x(y) = z can be written as x^z = y; the log is the exponent

When does the best case occur for quicksort?

when every call to partition results in the most balanced partition

When does the worst case occur for quicksort?

when every call to partition results in the most unbalanced partition

