Chapter 2.2: Asymptotic Order of Growth
If we can express an algorithm's worst-case running time on inputs of size n as growing at a rate proportional to some function f(n), then that function becomes a bound on the algorithm's running time. Counting the exact number of steps an algorithm executes is essentially meaningless, both because we will often be working from pseudo-code and because we want to classify broad classes of algorithms, so we need a more general way of expressing running time. In short, we want to express the growth rate of running times in a way that is insensitive to constant factors and low-order terms.
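To convince myself that constant factors and low-order terms really do wash out, I tried a quick check in Python (my own toy example, not from the book): the ratio of 2n^2 + 5n to n^2 settles toward the constant 2, so the two functions have the same order of growth.

```python
# Toy example: a "step count" with a constant factor and a low-order term
# grows at the same rate as plain n^2 -- the ratio tends to a constant.

def steps(n):
    return 2 * n**2 + 5 * n  # a hypothetical T(n)

def bound(n):
    return n**2              # the claimed growth rate f(n)

for n in (10, 100, 1_000, 10_000, 100_000):
    print(f"n = {n:>6}: T(n)/f(n) = {steps(n) / bound(n):.4f}")

# The ratio approaches 2, so the 5n term and the factor of 2
# do not change the order of growth: T(n) is Theta(n^2).
```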
Notation:

* T(n) – the worst-case running time of a given algorithm on an input of size n
* O(f(n)) – an asymptotic upper bound: T(n) grows no faster than the order of f(n)
* Ω(f(n)) – an asymptotic lower bound: T(n) grows at least as fast as the order of f(n)
* Θ(f(n)) – an asymptotically tight bound, which holds when T(n) is both O(f(n)) and Ω(f(n))
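To keep the symbols straight, I also wrote out the formal definitions with explicit constants (the standard form, where c, ε, and n0 are the witness constants):

```latex
% Formal definitions of the three bounds, with witness constants:
\[
  T(n) = O(f(n)) \iff \exists\, c > 0,\ n_0 \ge 0 :\ T(n) \le c \cdot f(n) \ \text{for all } n \ge n_0
\]
\[
  T(n) = \Omega(f(n)) \iff \exists\, \epsilon > 0,\ n_0 \ge 0 :\ T(n) \ge \epsilon \cdot f(n) \ \text{for all } n \ge n_0
\]
\[
  T(n) = \Theta(f(n)) \iff T(n) = O(f(n)) \ \text{and} \ T(n) = \Omega(f(n))
\]
```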
Some properties of asymptotic growth rates:

* Transitivity: if a function f is asymptotically upper-bounded by a function g, and g is in turn asymptotically upper-bounded by h, then f is asymptotically upper-bounded by h. The same property holds for lower bounds. (A quick proof sketch follows the list.)
  * If f = O(g) and g = O(h), then f = O(h)
  * If f = Ω(g) and g = Ω(h), then f = Ω(h)
* Sums of functions: if we have an asymptotic upper bound that applies to each of two functions f and g, then it applies to their sum.
  * If f = O(h) and g = O(h), then f + g = O(h)
  * Let k be a fixed constant, and let f1, f2, …, fk and h be functions such that fi = O(h) for all i. Then f1 + f2 + … + fk = O(h)
  * If f and g are two functions (taking nonnegative values) such that g = O(f), then f + g = Θ(f); that is, f is an asymptotically tight bound for the combined function f + g (the second sketch after this list shows why)
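Writing out the transitivity proof from the definitions finally made it click for me: the witness constants just multiply. A sketch of the upper-bound case (my own working; c, c', n0, n0' are the constants from the two assumed bounds):

```latex
% Transitivity for O: if f = O(g) and g = O(h), then f = O(h).
% Assume f(n) <= c g(n) for n >= n_0, and g(n) <= c' h(n) for n >= n_0'.
\[
  f(n) \;\le\; c \cdot g(n) \;\le\; c \cdot c' \cdot h(n)
  \quad \text{for all } n \ge \max(n_0, n_0'),
\]
% so f = O(h), witnessed by the constant c c' and threshold max(n_0, n_0').
```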
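The f + g = Θ(f) claim made more sense once I sandwiched the sum between f and a constant multiple of f (again my own working from the definitions):

```latex
% Assume g = O(f), i.e. g(n) <= c f(n) for all n >= n_0.
% Since f and g take nonnegative values, for all n >= n_0:
\[
  f(n) \;\le\; f(n) + g(n) \;\le\; f(n) + c \cdot f(n) \;=\; (1 + c)\, f(n),
\]
% so f + g = Omega(f) (left inequality, constant 1) and
% f + g = O(f) (right inequality, constant 1 + c); hence f + g = Theta(f).
```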
This was definitely the most difficult part of the reading for me. Maybe it is because I missed Monday’s class. Hopefully hearing this again in class will clear things up for me because I was completely lost in all of the variables and their relationships to one another. I’d rate this section a 3.