Section 2.2: Asymptotic Order of Growth
Bounds on running-time functions describe the "worst-case" growth that an algorithm's running time is limited by. A bound can be an upper bound, denoted O(f(n)); a lower bound, denoted Ω(f(n)); or a tight bound, denoted Θ(f(n)). An upper bound holds when the algorithm's running-time function is bounded above by a constant multiple of f(n) for sufficiently large n. A lower bound holds when the function is bounded below by a constant multiple of f(n). Finally, a tight bound holds when the same f(n) serves as both an upper and a lower bound. The section traced through some examples that solidified these definitions. Asymptotic bounds have several useful properties, including transitivity and additivity. Transitivity means that if f = O(g) and g = O(h), then f = O(h). Another property is that if f and g are two functions such that g = O(f), then f + g = Θ(f); in other words, the faster-growing term dominates a sum. Finally, the section discussed polynomial bounds (n raised to a constant power), logarithmic bounds, and exponential bounds (a constant greater than 1 raised to the power n). It outlined how to work with these bounds, along with the needed notation and conversions. The exponents in polynomial bounds do not need to be whole numbers, and the base of a logarithm does not matter asymptotically, since changing the base only multiplies the function by a constant. These three families of functions provide solid benchmarks for ranking growth rates.
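To make these definitions concrete, here is a minimal Python sketch (my own example, not from the text) that numerically checks the tight-bound definition for a made-up running time f(n) = 3n^2 + 10n. The constants c1, c2 and the threshold n0 are hand-picked assumptions.

# Hypothetical running time; the claim is f(n) = Theta(n^2).
def f(n):
    return 3 * n**2 + 10 * n

# Candidate witnesses: 3*n^2 <= f(n) <= 4*n^2 once n >= 10,
# since 4n^2 - f(n) = n^2 - 10n = n(n - 10) >= 0 for n >= 10.
c1, c2, n0 = 3, 4, 10

for n in [10, 100, 10**3, 10**6]:
    assert c1 * n**2 <= f(n) <= c2 * n**2, n

print("f(n) = 3n^2 + 10n is Theta(n^2) with c1=3, c2=4, n0=10")

Checking a handful of large n values like this is not a proof, of course, but it is a quick sanity check that the chosen constants actually sandwich the function.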
The motivation for looking at asymptotic bounds is to have a concrete way of describing worst-case behavior. They let us narrow our prediction of how quickly an algorithm will run to a well-defined range rather than a scattered set of possibilities. We can also predict how the running time of a program will be affected by combining different algorithms together, since sums of running times follow the additivity rule above, as the sketch below illustrates.
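As a rough illustration of that last point, here is a hypothetical Python sketch of a program with an n log n phase (say, a sort) followed by a linear phase (a scan). The step counts are stand-in formulas, not measurements; the printed ratio tends toward 1, suggesting that n log n + n = Theta(n log n), so the dominant phase sets the overall bound.

import math

def total_steps(n):
    sort_steps = n * math.log2(n)   # stand-in for the O(n log n) phase
    scan_steps = n                  # stand-in for the O(n) phase
    return sort_steps + scan_steps

for n in [2**10, 2**15, 2**20]:
    # ratio of the total to the dominant term approaches 1 as n grows
    print(n, total_steps(n) / (n * math.log2(n)))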
This section introduced a lot of notation, and it was a little confusing. I think going through some real-life examples would help clarify things, since reasoning about it purely abstractly is difficult.
I would give this section a 7 out of 10, as it was a little dry to read and consisted mostly of notation.