Ch. 10 Measuring the Efficiency of Algorithms
Three difficulties with comparing programs instead of algorithms
How are the algorithms coded?
What computer should you use?
What data should the programs use?
Average-case analysis
A determination of the average amount of time that an algorithm requires to solve problems of size n
Worst-case analysis
A determination of the maximum amount of time that an algorithm requires to solve problems of size n
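The worst-case versus average-case distinction can be made concrete with linear search. The sketch below is illustrative (the helper `linear_search_comparisons` is hypothetical, not from the text): searching for an absent target is the worst case and costs n comparisons, while the average over all positions of a present target is about (n + 1) / 2.

```python
def linear_search_comparisons(items, target):
    """Return the number of comparisons linear search performs."""
    comparisons = 0
    for item in items:
        comparisons += 1
        if item == target:
            break
    return comparisons

items = list(range(100))                              # n = 100
worst = linear_search_comparisons(items, -1)          # absent target: n comparisons
average = sum(linear_search_comparisons(items, t)
              for t in items) / len(items)            # (n + 1) / 2 comparisons
# worst = 100, average = 50.5
```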
Growth-rate function
A mathematical function used to specify an algorithm's order in terms of the size of the problem
Big O notation
A notation that uses the capital letter O to specify an algorithm's order Example: O(f(n))
Definition of the order of an algorithm
Algorithm A is order f(n) - denoted O(f(n)) - if constants k and n0 exist such that A requires no more than k * f(n) time units to solve a problem of size n ≥ n0
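A minimal sketch of how the definition is applied, assuming "time units" are counted loop operations (an assumption made for illustration; the formal definition does not fix the unit). A loop that sums 1 through n performs n + 2 operations, so f(n) = n with witnesses k = 2 and n0 = 2 shows it is O(n).

```python
def operation_count(n):
    """Count operations in a loop that sums 1..n:
    one initialization, n additions, one return."""
    count = 1          # initialization
    for _ in range(n):
        count += 1     # one addition per iteration
    count += 1         # return
    return count

# operation_count(n) = n + 2 <= 2 * n for all n >= 2,
# so the constants k = 2 and n0 = 2 witness O(n).
assert all(operation_count(n) <= 2 * n for n in range(2, 100))
```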
Counting an algorithm's operations is a way to assess its efficiency
An algorithm's execution time is related to the number of operations it requires
An algorithm's growth rate
Enables the comparison of one algorithm with another
Order of growth of some common functions
O(1) < O(log₂n) < O(n) < O(n · log₂n) < O(n²) < O(n³) < O(2ⁿ)
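A quick numeric check of this ordering at a single illustrative size, n = 1024 (the ordering holds for all sufficiently large n, not just this value):

```python
import math

n = 1024
rates = {
    "O(1)": 1,
    "O(log2 n)": math.log2(n),        # 10
    "O(n)": n,                        # 1024
    "O(n log2 n)": n * math.log2(n),  # 10240
    "O(n^2)": n ** 2,
    "O(n^3)": n ** 3,
    "O(2^n)": 2 ** n,
}
values = list(rates.values())
# Each function's value exceeds the previous one's at n = 1024.
assert values == sorted(values)
```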
Analysis of algorithms
Provides tools for contrasting the efficiency of different methods of solution
A comparison of algorithms
Should focus on significant differences in efficiency
Should not consider reductions in computing costs due to clever coding tricks
Algorithm analysis should be independent of
Specific implementations
Computers
Data
Properties of growth-rate functions
You can ignore low-order terms
You can ignore a multiplicative constant in the high-order term
O(f(n)) + O(g(n)) = O(f(n) + g(n))
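A sketch of why low-order terms can be ignored, using a hypothetical operation count n² + 3n + 5: as n grows, the high-order term n² dominates, so the ratio of the full cost to n² approaches 1 and the function is O(n²).

```python
def cost(n):
    """Hypothetical operation count with low-order terms."""
    return n * n + 3 * n + 5

for n in [10, 100, 10_000]:
    ratio = cost(n) / (n * n)
    print(n, round(ratio, 4))
# The ratio shrinks toward 1 as n grows, so cost(n) is O(n^2).
```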
Algorithm efficiency is typically a concern for
large problems only
An algorithm's time requirements
Can be measured as a function of the problem size
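One rough way to see time as a function of problem size is to time the same algorithm at several sizes; the sketch below (an illustration, not a rigorous benchmark; timings vary by machine) times a linear-time loop as n doubles.

```python
import time

def linear_work(n):
    """A linear-time task: sum the integers 0..n-1 with a loop."""
    total = 0
    for i in range(n):
        total += i
    return total

for n in [100_000, 200_000, 400_000]:
    start = time.perf_counter()
    linear_work(n)
    elapsed = time.perf_counter() - start
    print(f"n = {n:>7}: {elapsed:.4f} s")   # time grows roughly with n
```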