Section 2.1 - Computational Tractability
To begin with, we will focus on analyzing the worst-case running time: we will look for a bound on the largest possible running time the algorithm could have over all inputs of a given size N, and see how this scales with N. (We don't use average-case analysis for a few reasons. See page 31.)
We eventually arrive at a definition of efficiency: An algorithm is efficient if it has a polynomial running time.
The growth rates of polynomial functions and exponential functions are also hugely different. (See the table on page 34.) One of the benefits of making this our definition is that it becomes negatable: it is easier to express the claim that there is no efficient algorithm for a particular problem.
I became more confused about this topic after reading this section, and would only rate it a 7.
Section 2.2 - Asymptotic Order of Growth
Our discussion of computational tractability has turned out to be intrinsically based on our ability to express the notion that an algorithm's worst-case running time on inputs of size n grows at a rate that is at most proportional to some function f(n). The function f(n) then becomes a bound on the running time of the algorithm.
When we provide a bound on the running time of an algorithm, we will generally be counting the number of such pseudo-code steps that are executed; in this context, one step will consist of assigning a value to a variable, looking up an entry in an array, following a pointer, or performing an arithmetic operation on a fixed-size integer.
We actually don't want to find extremely concrete statements about the running time of an algorithm on inputs of size n, because 1) it would be a lot of work, 2) it would make it harder to compare algorithms with similar behavior, and 3) such overly detailed statements are not very meaningful anyway.
Asymptotic Upper Bounds Let T(n) be a function (say, the worst-case running time of a certain algorithm on an input of size n). Given another function f(n), we say that T(n) is O(f(n)) if there exist constants c > 0 and n0 >= 0 so that for all n >= n0, we have T(n) <= c * f(n). In this case, we will say that T is asymptotically upper-bounded by f.
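To make the definition concrete, here is a minimal sketch (my own example, not from the book) that numerically spot-checks a candidate pair of witnesses c and n0 for a hypothetical running time T(n) = 3n^2 + 5n against f(n) = n^2. Passing the check over a finite range doesn't prove the bound; it's just a sanity test of the definition.

```python
# Sketch: spot-check T(n) <= c * f(n) for all n0 <= n <= n_max.
# T, f, c, and n0 below are hypothetical example choices.

def holds_upper_bound(T, f, c, n0, n_max=10_000):
    """Return True if T(n) <= c * f(n) for every n0 <= n <= n_max."""
    return all(T(n) <= c * f(n) for n in range(n0, n_max + 1))

T = lambda n: 3 * n**2 + 5 * n   # hypothetical worst-case running time
f = lambda n: n**2               # candidate bounding function

# 3n^2 + 5n <= 4n^2 as soon as 5n <= n^2, i.e. for all n >= 5.
print(holds_upper_bound(T, f, c=4, n0=5))   # True
```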
Asymptotic Lower Bounds We want to show that for arbitrarily large input sizes n, T(n) is at least a constant multiple of some specific function f(n). So, we say that T(n) is Ω(f(n)) if there exist constants ε > 0 and n0 >= 0 so that for all n >= n0, we have T(n) >= ε * f(n). We will refer to T in this case as being asymptotically lower-bounded by f.
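A matching sketch for the lower-bound direction, reusing the same hypothetical T(n): since 3n^2 + 5n >= 3n^2 for every n >= 1, the constants ε = 3 and n0 = 1 serve as witnesses.

```python
# Companion sketch: spot-check T(n) >= eps * f(n) for all n0 <= n <= n_max.

def holds_lower_bound(T, f, eps, n0, n_max=10_000):
    """Return True if T(n) >= eps * f(n) for every n0 <= n <= n_max."""
    return all(T(n) >= eps * f(n) for n in range(n0, n_max + 1))

T = lambda n: 3 * n**2 + 5 * n   # same hypothetical running time as above
f = lambda n: n**2

print(holds_lower_bound(T, f, eps=3, n0=1))   # True
```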
Asymptotic Tight Bounds If we can show that a running time T(n) is both O(f(n)) and Ω(f(n)), then we've found the “right bound” and we can say that T(n) is Θ(f(n)). Asymptotically tight bounds on worst-case running times are nice things to find, since they characterize the worst-case performance of an algorithm precisely up to constant factors.
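Continuing the hypothetical example from the sketches above: 3n^2 <= 3n^2 + 5n <= 4n^2 for all n >= 5, so that running time is both O(n^2) and Ω(n^2), and therefore Θ(n^2).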
Properties of Asymptotic Growth Rates
One property is transitivity: if a function f is asymptotically upper-bounded by a function g, and g is asymptotically upper-bounded by a function h, then f is asymptotically upper-bounded by h. A similar property holds for lower bounds.
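To spell out why (a short argument of my own, not quoted from the book): if f(n) <= c * g(n) for all n >= n0, and g(n) <= c' * h(n) for all n >= n0', then f(n) <= c * c' * h(n) for all n >= max(n0, n0'), so the product of the two constants and the larger of the two thresholds witness f = O(h).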
It is also useful to have results that quantify the effect of adding two functions. If we have an asymptotic upper bound that applies to each of two functions f and g, then it applies to their sum: if f = O(h) and g = O(h), then f + g = O(h).
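Spelled out the same way (again my own wording): if f(n) <= c * h(n) for n >= n0 and g(n) <= c' * h(n) for n >= n0', then f(n) + g(n) <= (c + c') * h(n) for all n >= max(n0, n0'), so the sum of the two constants works as a witness for f + g = O(h).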
Asymptotic Bounds for Some Common Functions
Polynomials The asymptotic rate of growth for polynomials is determined by their “high-order term” - the one that determines the degree.
- Let f be a polynomial of degree d, in which the coefficient a_d is positive. Then f = O(n^d).
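As a concrete instance (my own example, not from the text): for f(n) = n^2 + 10n + 50, each lower-order term is at most its coefficient times n^2 once n >= 1, so f(n) <= 61 * n^2 for all n >= 1, which gives f = O(n^2).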
A polynomial-time algorithm is one whose running time T(n) is O(n^d) for some constant d where d is independent of the input size.
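To tie this back to the step-counting model described earlier, here is a toy example of my own (not from the book): an algorithm whose nested loops do a constant number of steps per pair, so its running time is O(n^2), i.e. polynomial with d = 2.

```python
# Toy sketch: count the pairs (i, j) with i < j whose values sum to a target.
# The nested loops do a constant amount of work per pair, so the step count
# is about n * (n - 1) / 2, which is O(n^2) -- a polynomial-time algorithm.

def count_pairs_with_sum(values, target):
    n = len(values)
    count = 0
    for i in range(n):
        for j in range(i + 1, n):
            if values[i] + values[j] == target:
                count += 1
    return count

print(count_pairs_with_sum([1, 2, 3, 4, 5], 6))   # (1, 5) and (2, 4) -> 2
```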
Logarithms For every base b, the function log_b n is asymptotically upper-bounded by every function of the form n^x, even for values of x arbitrarily close to 0 (for any fixed x > 0). The base is not important when writing bounds using asymptotic notation, since changing the base only changes the logarithm by a constant factor.
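A quick numerical illustration of that last point (my own sketch, not the book's): the ratio between logarithms in two different bases is the same constant for every n, which is exactly the kind of constant factor that O(·) hides.

```python
import math

# The ratio log_2(n) / log_10(n) equals log_2(10) ~ 3.32 for every n, so
# switching bases only rescales a logarithmic bound by a constant factor.
for n in (10, 1_000, 1_000_000, 10**12):
    print(n, math.log2(n) / math.log10(n))   # ~3.3219 each time
```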
Exponentials For every r > 1 and every d > 0, we have n^d = O(r^n). Asymptotically speaking, exponential functions with different bases are all different from one another. Still, when people say a running time “is exponential” without specifying the base, it's usually clear that they mean it grows at least as fast as some exponential function.
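And a small numerical sketch of n^d = O(r^n), with the hypothetical choices d = 3 and r = 1.5: the ratio n^d / r^n falls off toward 0, so any positive constant eventually works as a witness.

```python
# The ratio n^3 / 1.5^n shrinks toward 0 as n grows, illustrating that even
# a modest exponential base eventually dominates any fixed power of n.
# (d = 3 and r = 1.5 are arbitrary example choices.)
for n in (10, 25, 50, 100):
    print(n, n**3 / 1.5**n)   # the printed ratios drop off rapidly
```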
After reading this section of the textbook, I had to go look at the slides from class to get it all straight, but when combined with the notes from class, I would give this an 8.