

Chapter 2: Basics of Algorithm Analysis

2.1 Computational Tractability

Several factors are key in the design of algorithms, including understanding the discrete nature of many problems as well as accounting for an algorithm's running time and space requirements. The text gives one definition of efficiency: “An algorithm is efficient if, when implemented, it runs quickly on real input instances.” Unfortunately, this definition fails to consider where and how well the algorithm is implemented.

To be more specific, we can argue that an algorithm is efficient if, at a bare minimum, its worst-case performance is better than that of a brute-force search. To narrow the definition even further, we could classify an 'efficient algorithm' as one with a polynomial running time, meaning its running time is bounded by cN^d, where c and d are constants > 0 and N is the size of the input instance.
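As a hypothetical illustration of beating brute force (this example is not from the text; the problem and function names are my own), consider deciding whether any two numbers in a list sum to a given target. A brute-force search tries every pair, roughly N^2/2 comparisons, while a hash-set scan does the same job in a single linear pass:

```python
def has_pair_brute_force(nums, target):
    """Brute-force search: try every pair of indices, about N^2/2 checks."""
    for i in range(len(nums)):
        for j in range(i + 1, len(nums)):
            if nums[i] + nums[j] == target:
                return True
    return False


def has_pair_hashing(nums, target):
    """Faster alternative: one pass with a set of values seen so far, O(N)."""
    seen = set()
    for x in nums:
        if target - x in seen:   # have we already seen the complement?
            return True
        seen.add(x)
    return False


print(has_pair_brute_force([3, 8, 1, 6], 9))  # True (3 + 6 = 9)
print(has_pair_hashing([3, 8, 1, 6], 9))      # True
```

Both functions have polynomial running time, but the second shows that "better than brute force" is often achievable with a different organization of the same work.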

2.2 Asymptotic Order of Growth

For the purposes of simplicity and clarity, running times should be classified without regard to constant factors; for example, instead of reporting a running time as an^2 + bn + c, we would consider this running time to be simply n^2. Assume T(n) is the worst-case running time for an algorithm, with n referring to the size of the input, and let f(n) be a function. T is asymptotically upper-bounded by f, and T(n) is order f(n), if:

  • there exists a constant c > 0, and
  • there exists a constant n0 ≥ 0, such that
  • for all n ≥ n0, T(n) ≤ c · f(n).
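The definition can be checked numerically for a concrete case. The sketch below (my own example, not from the text) takes T(n) = 3n^2 + 7n + 5 and witnesses T(n) is order n^2 with the particular choices c = 4 and n0 = 8:

```python
def T(n):
    """A worst-case running time of the form an^2 + bn + c."""
    return 3 * n**2 + 7 * n + 5


def f(n):
    """The candidate upper-bound function n^2."""
    return n**2

# Chosen witnesses: 3n^2 + 7n + 5 <= 4n^2 exactly when n^2 >= 7n + 5,
# which first holds at n = 8.
c, n0 = 4, 8

# Verify T(n) <= c * f(n) for every n from n0 up to a large test limit.
assert all(T(n) <= c * f(n) for n in range(n0, 10_000))
print("T(n) <= c * f(n) holds for all tested n >= n0")
```

Of course, a finite check is only a sanity test; the constants here are justified by the algebra in the comment, not by the loop itself.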
courses/cs211/winter2018/journals/devlinn/chapter2.1515540159.txt.gz · Last modified: by devlinn
CC Attribution-Noncommercial-Share Alike 4.0 International