Computational complexity
random access machine (RAM)
a model of computation where steps are executed sequentially and each step is an operation that takes a constant amount of time (assignment, comparison, arithmetic operation, accessing an object in memory)
tight bound
an upper and lower bound on the asymptotic running time
average-case running time
average running time over all possible inputs of a given size
exponential complexity
complexity grows as some constant raised to the power n (e.g. 2^n); among the most expensive types of algorithm
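For example (a hypothetical illustration, not from the source), naive recursive Fibonacci branches into two calls per step, so the call tree grows on the order of 2^n:

```python
def fib(n):
    # Each call spawns two more, so the call tree has on the
    # order of 2**n nodes -- exponential complexity.
    if n <= 1:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(10))  # 55
```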
linear complexity
complexity grows linearly with the size of the input; generally seen when iterating over a list, and can also arise from recursive calls that recurse once per element
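A minimal sketch (hypothetical example): summing a list does one constant-time step per element, so the running time is linear in the list's length:

```python
def my_sum(xs):
    # One constant-time addition per element: O(n).
    total = 0
    for x in xs:
        total += x
    return total

print(my_sum([1, 2, 3, 4]))  # 10
```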
logarithmic complexity
complexity grows with the logarithm of the size of one of its inputs
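The classic instance is binary search (sketched here as a hypothetical example): halving the search range each iteration means roughly log2(n) iterations on a sorted list:

```python
def binary_search(xs, target):
    # Halves the remaining range each iteration: O(log n)
    # on a sorted list. Returns the index, or -1 if absent.
    lo, hi = 0, len(xs) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if xs[mid] == target:
            return mid
        elif xs[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

print(binary_search([1, 3, 5, 7, 9], 7))  # 3
```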
polynomial complexity
complexity grows with n raised to some constant power; seen with nested loops or particular recursive calls; the most common is quadratic
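The nested-loop case can be sketched as follows (a hypothetical example): checking every pair of elements makes about n*(n-1)/2 comparisons, which is quadratic:

```python
def has_duplicate(xs):
    # Nested loops over the same list compare every pair:
    # roughly n*(n-1)/2 comparisons, so O(n**2).
    for i in range(len(xs)):
        for j in range(i + 1, len(xs)):
            if xs[i] == xs[j]:
                return True
    return False

print(has_duplicate([1, 2, 3, 2]))  # True
```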
worst-case running time
maximum running time over all possible inputs of a given size, provides an upper bound on the running time
best-case running time
minimum running time over all possible inputs of a given size
step
operation that takes a fixed amount of time
constant complexity
the upper bound is independent of the size of the input; may still contain loops or recursive calls, as long as the number of iterations or calls does not depend on the size of the input
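A minimal sketch (hypothetical example): indexing into a list takes the same fixed number of steps whether the list has ten elements or ten million, so it is constant time under the RAM model:

```python
def get_last(xs):
    # Indexing is a single memory access in the RAM model: O(1),
    # regardless of how long the list is.
    return xs[-1]

print(get_last([1, 2, 3]))  # 3
```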
Big O notation
used to give an upper bound on the asymptotic growth of a function, also called the order of the function
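To illustrate why constant factors and lower-order terms are dropped (f here is a hypothetical function, not from the source): f(n) = 3n^2 + 2n is O(n^2), because for all n >= 2 it is bounded above by 4n^2:

```python
def f(n):
    # Hypothetical running-time function: 3n^2 + 2n steps.
    return 3 * n * n + 2 * n

# For every n >= 2, f(n) <= 4*n**2, so f(n) is O(n**2):
# the constant factor 3 and the lower-order term 2n are dropped.
print(all(f(n) <= 4 * n * n for n in range(2, 1000)))  # True
```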
asymptotic notation
describes the complexity of an algorithm as the size of its inputs approaches infinity, in terms of growth classes such as linear, quadratic, and exponential