2.2 Asymptotic Order of Growth

Asymptotic Upper Bounds

For instance, suppose T(n) = pn² + qn + r, where p, q, r are positive constants.
For all n ≥ 1, we have n ≤ n² and 1 ≤ n², so
T(n) = pn² + qn + r ≤ pn² + qn² + rn²
pn² + qn² + rn² = (p + q + r)n² = cn²
⇒ T(n) ≤ cn², where c = p + q + r
Thus, T(n) = O(n²)
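
As a quick sanity check on this derivation, here is a minimal Python sketch; the constants p, q, r and the test range are arbitrary illustrative choices, not anything from the text:

    # Spot-check the upper bound T(n) <= c*n^2 with c = p + q + r.
    # The constants are arbitrary positive values chosen for illustration.
    p, q, r = 3.0, 5.0, 2.0
    c = p + q + r

    def T(n):
        return p * n**2 + q * n + r

    for n in range(1, 1001):
        assert T(n) <= c * n**2, f"upper bound fails at n = {n}"

    print("T(n) <= c*n^2 holds for all tested n in [1, 1000]")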

Asymptotic Lower Bounds

For instance, suppose T(n) = pn² + qn + r, where p, q, r are positive constants.

For all n ≥ 0, since qn + r ≥ 0, T(n) = pn² + qn + r ≥ pn²
⇒ T(n) ≥ εn², where ε = p > 0

⇒ T(n) = Ω(n²)
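
The lower bound can be spot-checked the same way; this sketch reuses the same illustrative constants and verifies T(n) ≥ εn² with ε = p:

    # Spot-check the lower bound T(n) >= eps*n^2 with eps = p.
    # Same arbitrary illustrative constants as in the upper-bound check.
    p, q, r = 3.0, 5.0, 2.0
    eps = p  # dropping qn and r can only decrease T(n), since q, r > 0

    def T(n):
        return p * n**2 + q * n + r

    for n in range(0, 1001):
        assert T(n) >= eps * n**2, f"lower bound fails at n = {n}"

    print("T(n) >= eps*n^2 holds for all tested n in [0, 1000]")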

Asymptotic Tight Bounds

Combining the two bounds above: since T(n) = O(n²) and T(n) = Ω(n²), it follows that T(n) = Θ(n²).

Properties of Asymptotic Growth Rates

Asymptotic Bounds for some Common Functions

Polynomials, logarithms, and exponentials are the three families of functions most frequently encountered when analyzing running times. Logarithms grow more slowly than polynomials, and polynomials grow more slowly than exponentials.
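
To make that ordering concrete, here is a small Python sketch; the choice of log₂ n, n², and 2ⁿ as representatives of the three classes, and the sample values of n, are my own:

    import math

    # Print one representative from each growth class side by side:
    # logarithmic, polynomial, and exponential.
    for n in [2, 4, 8, 16, 32, 64]:
        print(f"n = {n:3d}   log2(n) = {math.log2(n):5.1f}   "
              f"n^2 = {n**2:6d}   2^n = {2**n:22d}")

Even at n = 64, log₂ n is only 6 and n² is 4096, while 2ⁿ already has twenty digits.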

There are several motivations for studying bounds on the running times of algorithms. Knowing an algorithm's running time makes it possible to refine the algorithm as far as one can. Running times also serve as a measure of comparison when deciding which of several algorithms is more efficient for a given problem. A programmer should therefore aim for the most efficient algorithm they can design, and running-time analysis is what lets them verify that a new algorithm actually improves on the one previously implemented.

I don't have a specific question about this section, as most of the material makes sense. After rereading this section and seeing how it was presented in class, I can now see the importance of tight bounds. Tight bounds are a new concept for me when it comes to running times of algorithms, but I was able to see why they are useful in analysis: they give more precise limits on the running time of an algorithm. I would like to remember the properties of asymptotic growth rates; they look very useful, and I'm convinced they can help a lot in analyzing running times of algorithms.

This section was interesting too; I give it a 9/10.