Section 2.1 - Computational Tractability

To begin with, we will focus on analyzing the worst-case running time: we will look for a bound on the largest possible running time the algorithm could have over all inputs of a given size N and see how this scales with N. (We don't use average-case analysis for a few reasons. See pg 31.)

We eventually arrive at a definition of efficiency: An algorithm is efficient if it has a polynomial running time.

The growth rates of polynomial functions and exponential functions are hugely different as well (see table on page 34). One of the benefits of making polynomial running time our definition of efficiency is that it becomes negatable: it is easier to express that there is no efficient algorithm for a particular problem.

I became more confused about this topic after reading this section and would only rate it a 7 on an interest scale. I think it was clearer in class, where we could work things out on the board and concepts were explained in easier-to-understand (more conversational?) terms.

Section 2.2 - Asymptotic Order of Growth

Our discussion of computational tractability has turned out to be intrinsically based on our ability to express the notion that an algorithm's worst-case running time on inputs of size n grows at a rate that is at most proportional to some function f(n). The function f(n) then becomes a bound on the running time of the algorithm.

When we provide a bound on the running time of an algorithm, we will generally be counting the number of pseudo-code steps that are executed; in this context, one step consists of assigning a value to a variable, looking up an entry in an array, following a pointer, or performing an arithmetic operation on a fixed-size integer.

We actually don't want super-concrete statements about the running time of an algorithm on inputs of size n, because (1) it would be a lot of work, (2) it makes it harder to compare algorithms with similar behavior, and (3) such detailed statements are kind of meaningless anyway.

Asymptotic Upper Bounds Let T(n) be a function (say, the worst-case running time of a certain algorithm on an input of size n). Given another function f(n), we say that T(n) is O(f(n)) if there exist constants c > 0 and n0 >= 0 so that for all n >= n0, we have T(n) <= c * f(n). In this case, we will say that T is asymptotically upper-bounded by f.

Asymptotic Lower Bounds We want to show that for arbitrarily large input sizes n, T(n) is at least a constant multiple of some specific function f(n). So, we say that T(n) is Ω(f(n)) if there exist constants ε > 0 and n0 >= 0 so that for all n >= n0, we have T(n) >= ε * f(n). We will refer to T in this case as being asymptotically lower-bounded by f.

Asymptotic Tight Bounds If we can show that a running time T(n) is both O(f(n)) and Ω(f(n)), then we've found the “right” bound and we can say that T(n) is Θ(f(n)). Asymptotically tight bounds on worst-case running times are nice things to find, since they characterize the worst-case performance of an algorithm precisely up to constant factors.
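
As a quick worked example (my own, not from the book): take T(n) = 3n^2 + 10n. For all n >= 1 we have 10n <= 10n^2, so T(n) <= 13n^2, which makes T(n) O(n^2) with c = 13 and n0 = 1. We also have T(n) >= 3n^2 for all n >= 1, so T(n) is Ω(n^2) with ε = 3. Since both bounds hold, T(n) is Θ(n^2).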

Properties of Asymptotic Growth Rates

One property is transitivity: if a function f is asymptotically upper-bounded by a function g, and g is in turn asymptotically upper-bounded by a function h, then f is asymptotically upper-bounded by h. A similar property holds for lower bounds.

It is also useful to have results that quantify the effect of adding two functions. If we have an asymptotic upper bound that applies to each of two functions f and g, then it applies to their sum: for instance, if f is O(h) and g is O(h), then f + g is O(h).

Asymptotic Bounds for Some Common Functions

Polynomials The asymptotic rate of growth for polynomials is determined by their “high-order term” - the one that determines the degree.

  • Let f be a polynomial of degree d, in which the coefficient a_d is positive. Then f = O(n^d).

A polynomial-time algorithm is one whose running time T(n) is O(n^d) for some constant d where d is independent of the input size.

Logarithms For every base b, the function log_b n is asymptotically upper-bounded by every function of the form n^x, even for values of x arbitrarily close to 0. The base is not important when writing bounds in asymptotic notation, since log_b n = (log_a n)/(log_a b): changing the base only changes the constant factor.

Exponentials For every r > 1 and every d > 0, we have n^d = O(r^n). Asymptotically speaking, exponential functions with different bases are all different. Still, when people loosely say a running time is “exponential” without specifying the base, they usually mean that it grows at least as fast as some exponential function.
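
A quick numeric check in Python (my own illustration, not from the book) of how even a modest exponential base overtakes a polynomial:

  # n^3 dominates at first, but 1.5^n overtakes it somewhere past n = 20.
  for n in (10, 20, 30, 40):
      print(n, n ** 3, round(1.5 ** n))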

After reading this section of the textbook, I had to go look at the slides from class to get it all straight, but combined with the notes from class, I would give this an 8.

Section 2.3 -

Section 2.4 - A Survey of Common Running Times

Linear Time An algorithm that runs in O(n), or linear, time has a very natural property: its running time is at most a constant factor times the size of the input. One basic way to get an algorithm with this running time is to process the input in a single pass, spending a constant amount of time on each item of input encountered. Two example linear-time algorithms are computing the maximum and merging two sorted lists.
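
A minimal sketch of the merge idea in Python (my own sketch, not the book's pseudocode); it spends constant time per element, so it runs in O(n) time on inputs of total size n:

  def merge_sorted(a, b):
      # Merge two sorted lists by repeatedly taking the smaller front element.
      result = []
      i = j = 0
      while i < len(a) and j < len(b):
          if a[i] <= b[j]:
              result.append(a[i])
              i += 1
          else:
              result.append(b[j])
              j += 1
      # One list is exhausted; append whatever remains of the other.
      result.extend(a[i:])
      result.extend(b[j:])
      return result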

Cubic Time Elaborate sets of nested loops often lead to algorithms that run in O(n^3) time. For example, given n sets, checking whether some pair of them is disjoint (see page 52): for each of the O(n^2) pairs, we scan the elements of one set and test whether each is in the other.
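
A rough Python sketch of that example (the name and representation are my own assumptions; the book assumes membership tests cost constant time, which Python sets roughly provide):

  def some_pair_disjoint(sets):
      # sets is a list of n Python sets, each with up to n elements.
      # O(n^2) pairs, times an O(n) scan per pair, gives O(n^3) overall.
      n = len(sets)
      for i in range(n):
          for j in range(i + 1, n):
              if all(x not in sets[j] for x in sets[i]):
                  return True  # found a disjoint pair
      return False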

O(n^k) We obtain a running time of O(n^k) for any constant k when we search over all subsets of size k. For example, finding independent sets of size k in a graph (page 53). (It is believed that no algorithm to find k-node independent sets in arbitrary graphs can avoid having some dependence on k in the exponent.)
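
A brute-force Python sketch of that search (my own illustration; the helper name and the adjacency-set representation adj are assumptions):

  from itertools import combinations

  def has_independent_set(adj, k):
      # adj maps each node to the set of its neighbors.
      # Trying all C(n, k) = O(n^k) subsets of k nodes; each independence
      # check looks at O(k^2) pairs, a constant amount of work for fixed k.
      nodes = list(adj)
      for subset in combinations(nodes, k):
          if all(v not in adj[u] for u, v in combinations(subset, 2)):
              return True  # subset has no internal edges
      return False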

Beyond Polynomial Time Two kinds of bounds that come up very frequently are 2^n and n!. For example, we get 2^n as a running time for a search algorithm that must consider all subsets. The function n! grows even more rapidly than 2^n. Search spaces of size n! tend to arise for one of two reasons: first, when n! is the number of ways to match up n items with n other items, and second, when n! counts the ways to arrange n items in order.
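
A tiny Python sketch (my own) of why 2^n shows up: enumerating every subset of n items, here via bitmasks:

  def all_subsets(items):
      # There are 2^n bitmasks over n items, one per subset, so any
      # search that considers them all performs at least 2^n steps.
      n = len(items)
      for mask in range(1 << n):
          yield [items[i] for i in range(n) if mask >> i & 1]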

Sublinear Time There are cases where one encounters running times that are asymptotically smaller than linear. Since it takes linear time just to read the input, these situations tend to arise in a model of computation where the input can be “queried” indirectly rather than read completely, and the goal is to minimize the amount of querying that must be done. For example, the binary search algorithm has a running time of O(log n).
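
A standard binary search in Python (my own sketch): each probe halves the remaining range, so only O(log n) entries of the sorted list are ever examined:

  def binary_search(a, target):
      # Return an index of target in sorted list a, or -1 if absent.
      lo, hi = 0, len(a) - 1
      while lo <= hi:
          mid = (lo + hi) // 2
          if a[mid] == target:
              return mid
          if a[mid] < target:
              lo = mid + 1   # target can only be in the upper half
          else:
              hi = mid - 1   # target can only be in the lower half
      return -1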

Section 2.5 - A More Complex Data Structure: Priority Queues
