Asymptotic Complexity
average case vs worst case
There is an additional wrinkle: even at a given problem size, the performance of an algorithm may differ widely for different inputs. For example, some sorting algorithms run faster when the array is already sorted. To account for this variability, we usually characterize the worst-case performance of the algorithm on inputs of a given size, though sometimes, average-case performance is a more important performance measure.
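The sorting example above can be made concrete by counting comparisons. This is a minimal Python sketch (not from the notes, and insertion sort is used here only as an illustrative choice): the same algorithm, on inputs of the same size, does far less work when the array is already sorted.

```python
def insertion_sort_comparisons(a):
    """Sort a copy of a with insertion sort; return the number of comparisons made."""
    a = list(a)
    comparisons = 0
    for i in range(1, len(a)):
        j = i
        # Walk the new element left until it is in position.
        while j > 0:
            comparisons += 1
            if a[j - 1] > a[j]:
                a[j - 1], a[j] = a[j], a[j - 1]
                j -= 1
            else:
                break
    return comparisons

n = 100
best = insertion_sort_comparisons(range(n))          # already sorted: about n comparisons
worst = insertion_sort_comparisons(range(n, 0, -1))  # reverse sorted: about n^2/2 comparisons
```

For n = 100, the sorted input needs 99 comparisons while the reverse-sorted input needs 4950, which is why we distinguish best-, average-, and worst-case performance.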
Big O notation
We write expressions like O(n) and O(n^2) to describe the performance of algorithms in terms of asymptotic complexity. This notation is called "big-O" notation. It describes performance in a way that is largely independent of the kind of computer on which we are running our code. It is "asymptotic" in the sense that it describes how performance behaves as we increase n to large values. It places an upper bound on the amount of time taken by an algorithm, while ignoring constant factors.
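A quick way to see what "asymptotic" means is to count basic steps directly. In this hypothetical sketch (not part of the original notes), doubling n doubles the step count of an O(n) loop but quadruples the step count of an O(n^2) nested loop, regardless of any constant factor:

```python
def steps_linear(n):
    """One basic step per element: O(n)."""
    steps = 0
    for _ in range(n):
        steps += 1
    return steps

def steps_quadratic(n):
    """One basic step per pair of elements: O(n^2)."""
    steps = 0
    for _ in range(n):
        for _ in range(n):
            steps += 1
    return steps

# Doubling n: the linear count doubles, the quadratic count quadruples.
ratio_linear = steps_linear(2000) / steps_linear(1000)        # 2.0
ratio_quadratic = steps_quadratic(200) / steps_quadratic(100) # 4.0
```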
O(n) is said to be
asymptotically linear
for big arrays, which has better performance: linear or binary search?
binary search
O(1) is said to be
constant time, because the running time is always less than some constant k
for small arrays, which has better performance: linear or binary search?
linear search
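Linear search, mentioned in the cards above, can be sketched in a few lines of Python (a minimal illustration, not code from the notes). It scans the array front to back, so in the worst case it examines all n elements: O(n).

```python
def linear_search(a, target):
    """Return an index of target in list a, or -1 if absent. O(n) worst case."""
    for i, x in enumerate(a):
        if x == target:
            return i
    return -1
```

For small arrays this beats binary search in practice: its constant factors are tiny, and it does not require the array to be sorted.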
performance of a recursive implementation of binary search that finds an integer in a sorted array of integers:
logarithmic performance O(log n)
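A recursive binary search like the one the card refers to can be sketched as follows (a Python illustration under the stated assumption that the array is sorted; the notes themselves may use a different language). Each call halves the search range, so the depth of recursion is about log2(n), giving O(log n).

```python
def binary_search(a, target, lo=0, hi=None):
    """Return an index of target in sorted list a, or -1 if absent.

    O(log n): each recursive call halves the range [lo, hi).
    """
    if hi is None:
        hi = len(a)
    if lo >= hi:                 # empty range: target not present
        return -1
    mid = (lo + hi) // 2
    if a[mid] == target:
        return mid
    if a[mid] < target:          # target can only be in the right half
        return binary_search(a, target, mid + 1, hi)
    return binary_search(a, target, lo, mid)  # otherwise the left half
```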
O(n^2)
quadratic
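A typical source of quadratic, O(n^2), time is a pair of nested loops that compares every element with every other element. This hypothetical example (not from the notes) checks an unsorted list for duplicates that way:

```python
def has_duplicate(a):
    """Check for duplicates by comparing every pair: O(n^2) comparisons."""
    n = len(a)
    for i in range(n):
        for j in range(i + 1, n):  # each unordered pair is checked once
            if a[i] == a[j]:
                return True
    return False
```

The number of pairs is n(n-1)/2, so doubling the input roughly quadruples the worst-case work.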
time complexity space complexity
time complexity: The time complexity of an algorithm quantifies the amount of time taken by the algorithm to run as a function of the length of the input. Note that this is a function of the input length, not the actual execution time on the machine the algorithm runs on. The space complexity of an algorithm is the total space taken by the algorithm with respect to the input size; it includes both auxiliary space and the space used by the input.
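The distinction between auxiliary space and total space can be shown with two ways of reversing a list (a minimal Python sketch, not from the notes): both take the same O(n) input, but one allocates a second list while the other works within the input.

```python
def reverse_copy(a):
    """O(n) auxiliary space: builds and returns a brand-new reversed list."""
    return a[::-1]

def reverse_in_place(a):
    """O(1) auxiliary space: swaps elements within the input list itself."""
    i, j = 0, len(a) - 1
    while i < j:
        a[i], a[j] = a[j], a[i]
        i += 1
        j -= 1
    return a
```

Both have O(n) total space complexity because the input itself occupies O(n), but their auxiliary space differs: O(n) for the copy versus O(1) for the in-place version.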
Big O notation only establishes an ____ on how the function behaves
upper bound