Instead of counting the exact steps (number of instructions) in an algorithm, we shall find a more meaningful and less tedious way of describing the running time of an algorithm. We shall then use bounds to understand, in general terms, the range within which an algorithm's running time falls.

Asymptotic Upper Bounds. Let T(n) be our function and f(n) another given function. We say T(n) is order f(n), written T(n) = O(f(n)), if for sufficiently large input size n, T(n) is bounded above by c·f(n) for some constant c > 0. This means T(n) <= c·f(n) (T is asymptotically upper-bounded by f). For example, if T(n) = pn^2 + qn + r, then for n >= 1 we can inflate the lower-order terms (qn <= qn^2 and r <= rn^2) to get T(n) <= (p+q+r)·n^2, which can be written as T(n) <= c·n^2 with c = p+q+r > 0. But we could also write T(n) <= c·n^3. Among all the valid upper bounds obtained by inflating the terms, we prefer the tightest one (kinda like the lowest upper bound possible). From the calculation above, the tightest bound on T(n) is c·n^2, so we conclude that T(n) = O(n^2), where n^2 is our second function f(n). This is what we call Big-O notation: O(n^2) is an upper bound on T(n)'s worst-case running time.
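As a quick sanity check, here is a minimal Python sketch of the inflating-the-terms bound above. The coefficient values and the names T and c are my own illustration, and the check assumes p, q, r >= 0 (which is what makes the inflation step valid for n >= 1):

```python
# Minimal sketch: check that T(n) = p*n^2 + q*n + r is bounded above
# by c*n^2 with c = p + q + r, for all n >= 1 (assumes p, q, r >= 0).
p, q, r = 3, 5, 7  # arbitrary nonnegative coefficients (illustrative only)

def T(n):
    return p * n**2 + q * n + r

c = p + q + r  # inflate qn -> qn^2 and r -> rn^2, valid once n >= 1

for n in range(1, 10_001):
    assert T(n) <= c * n**2, f"bound fails at n={n}"
print(f"T(n) <= {c}*n^2 held for all n in [1, 10000]")
```

Of course, a finite check like this is only an illustration; the inequality itself holds for every n >= 1 by the term-by-term inflation argument.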
Asymptotic Lower Bounds. This works similarly to our Big-O notation, but in this case T(n) >= d·f(n), where d > 0 is a constant independent of n, and keep in mind that we are bounding T(n) from below this time. We now need to reduce the size of T(n) (deflating the terms): T(n) = pn^2 + qn + r >= pn^2 >= pn (assuming nonnegative coefficients). Both are valid lower bounds, but we go with pn^2 as the tightest one. It is now correct to say T(n) >= d·n^2, thus reaching the conclusion that T(n) = Ω(n^2): up to the constant d, n^2 is the largest lower bound this running time admits. If we can show that a running time T(n) is both O(f(n)) and Ω(f(n)), then in a natural sense we have found the right bound, and we say that T(n) is Θ(f(n)). In the example used above, T(n) is Θ(n^2). Asymptotically tight bounds on worst-case running times are nice to find, since they characterize the worst-case performance of an algorithm precisely up to constant factors.
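Similarly, a minimal Python check of the deflated lower bound (same illustrative coefficients and names as before; again this assumes q, r >= 0, which is what lets us drop those terms):

```python
# Minimal sketch: check the deflated lower bound T(n) >= d*n^2 with
# d = p, i.e. p*n^2 + q*n + r >= p*n^2 (assumes q, r >= 0).
p, q, r = 3, 5, 7  # same illustrative coefficients as above

def T(n):
    return p * n**2 + q * n + r

d = p  # drop the qn and r terms, which are nonnegative

for n in range(1, 10_001):
    assert T(n) >= d * n**2, f"bound fails at n={n}"
print(f"T(n) >= {d}*n^2 held for all n in [1, 10000]")
```

Together with the earlier check, this is the two-sided picture behind Θ(n^2): the same f(n) = n^2 works as both an upper and a lower bound, differing only in the constants c and d.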
On a scale of readability, I would rate Section 2.2 an 8/10.
These asymptotic growth rates have some basic properties, and these include:
*Transitivity: If a function h asymptotically upper-bounds a function g, which in turn asymptotically upper-bounds a function f, then h also asymptotically upper-bounds f; that is, if f = O(g) and g = O(h), then f = O(h). This also applies to lower bounds.
*Additivity: If an asymptotic upper bound applies to each of two functions, then it also applies to their sum; that is, if f = O(h) and g = O(h), then f + g = O(h) (see the witness-composition sketch after this list).
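A small sketch of why both properties hold, in terms of the constants involved. Here I represent an O-bound by a "witness" pair (c, n0), meaning the inequality holds with constant c for all n >= n0; the pair representation and function names are my own illustration, not notation from the text:

```python
# Illustrative sketch: composing Big-O "witnesses" (c, n0), where a
# witness for "f = O(g)" means f(n) <= c*g(n) for all n >= n0.

def compose_transitive(w_fg, w_gh):
    """Transitivity: if f <= c1*g (n >= n1) and g <= c2*h (n >= n2),
    then f <= (c1*c2)*h for n >= max(n1, n2), so f = O(h)."""
    (c1, n1), (c2, n2) = w_fg, w_gh
    return (c1 * c2, max(n1, n2))

def combine_additive(w_fh, w_gh):
    """Additivity: if f <= c1*h (n >= n1) and g <= c2*h (n >= n2),
    then f + g <= (c1 + c2)*h for n >= max(n1, n2), so f + g = O(h)."""
    (c1, n1), (c2, n2) = w_fh, w_gh
    return (c1 + c2, max(n1, n2))

# Example: f = O(g) with witness (15, 1), g = O(h) with witness (2, 4)
print(compose_transitive((15, 1), (2, 4)))  # -> (30, 4): f = O(h)
print(combine_additive((15, 1), (2, 4)))    # -> (17, 4): f + g = O(h)
```

The point of the sketch is that in each case we can write down the new constant explicitly, which is exactly why these properties hold up to constant factors.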