Chapter 2

Chapter 2.1 (Computational Tractability)

What specific approach to efficient algorithms should we take?

  • Identify broad themes and design principles that we can apply to a variety of problems.
  • Learn to apply these themes and principles to actual problems that differ in ways that subtly change our implementations.
  • Learn to distinguish between different combinatorial approaches so that we can improve efficiency, whether in storage, computational speed, etc.

What is an efficient algorithm?

  • An algorithm is efficient if, when implemented, it runs quickly on real input instances.
  • This definition misses some points, though: where and how well we implement the algorithm, what counts as a "real" input instance, and how the algorithm scales as inputs grow.

Worst-Case Running Times and Brute-Force Search

  • The worst-case running time is the slowest possible run time of an algorithm over all inputs of size n.
  • Useful because it's often hard to quantify the average-case scenario (or a "random" input) for an algorithm.
  • To tell whether a worst-case running time is efficient, compare it to a brute-force method.
  • Ex. brute-forcing the Stable Matching Problem takes O(n!) time, while the Gale-Shapley algorithm's worst case is only O(n^2).
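The contrast above can be sketched in Python. This is a minimal illustration, not the book's code: the 3x3 preference lists are made up, and the brute-force approach enumerates all n! perfect matchings and keeps only the stable ones, which is exactly what makes it O(n!).

```python
from itertools import permutations

# Hypothetical 3-man / 3-woman instance; preference lists are illustrative only.
# men_prefs[m] lists women in m's order of preference; similarly for women_prefs.
men_prefs   = {0: [0, 1, 2], 1: [1, 0, 2], 2: [0, 2, 1]}
women_prefs = {0: [1, 0, 2], 1: [0, 1, 2], 2: [2, 1, 0]}
n = len(men_prefs)

def is_stable(matching):
    """matching[m] = woman matched to man m. Unstable if some unmatched
    pair (m, w) prefer each other to their assigned partners."""
    wife = dict(enumerate(matching))
    husband = {w: m for m, w in wife.items()}
    for m in range(n):
        for w in range(n):
            if wife[m] == w:
                continue
            m_prefers_w = men_prefs[m].index(w) < men_prefs[m].index(wife[m])
            w_prefers_m = women_prefs[w].index(m) < women_prefs[w].index(husband[w])
            if m_prefers_w and w_prefers_m:
                return False  # (m, w) is an instability
    return True

# Brute force: try all n! perfect matchings, keep the stable ones.
stable = [p for p in permutations(range(n)) if is_stable(p)]
print(stable)
```

Gale-Shapley finds one stable matching directly in O(n^2) proposals; the sketch above instead pays the n! cost of checking every matching.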

New definition for efficient algorithm:

  • An algorithm is efficient if it achieves qualitatively better worst-case performance, at an analytical level, than brute-force search.

Polynomial Time Definition

  • A running time is “reasonable” if, when the input size is scaled by a constant factor, the algorithm only slows down by a constant factor.
  • An algorithm is efficient if it has a polynomial running time.
  • While this is sometimes vague or fails for extreme examples (e.g., an n^100 algorithm is technically "efficient"), it's better than the first two definitions.
  • This definition is also useful because it is concrete and negatable: it lets us say that some problems have efficient solutions while others provably (or apparently) do not.
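The "constant-factor slowdown" idea above is easy to check numerically. A minimal sketch (the functions are illustrative stand-ins for running times): doubling the input size multiplies a polynomial running time by a fixed constant, while an exponential running time slows down by a factor that itself grows with n.

```python
# Sketch: "reasonable" scaling means T(c * n) / T(n) is bounded by a constant.
def ratio(T, n, factor=2):
    """Slowdown factor when the input size n is scaled by `factor`."""
    return T(factor * n) / T(n)

poly = lambda n: n**2   # polynomial running time
expo = lambda n: 2**n   # exponential running time

for n in (10, 20, 40):
    print(n, ratio(poly, n), ratio(expo, n))
```

For the polynomial, the ratio is always exactly 4 = 2^2, independent of n; for the exponential, the slowdown factor is 2^n, which explodes as n grows.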

Chapter 2.2 (Asymptotic Order of Growth)

Runtime Bounds

  • No need to be extremely precise about them, as that is an exhaustive effort and usually unnecessary (unless you are under oppressively fine constraints for an algorithm's implementation).
  • By working at the right level of granularity, similarities between algorithms start to appear.

Asymptotic Upper Bounds

  • An upper bound O(f(n)) exists if the worst-case running time T(n) is at most a constant multiple of f(n), i.e., T(n) <= c * f(n) for all n beyond some n0.

Asymptotic Lower Bounds

  • A lower bound Ω(f(n)) exists if the running time T(n) is at least a constant multiple of f(n), i.e., T(n) >= ε * f(n), for sufficiently large n (necessary to prove the bound asymptotically, not just for tiny, insignificant cases).

Asymptotic Tight Bounds

  • If we can show that the same function f(n) is both an asymptotic upper and lower bound on the running time, then f(n) is an asymptotically tight bound, written Θ(f(n)).
  • These tight bounds are good to find for worst-case running times because they characterize the runtime precisely, up to constant factors.
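The three bound definitions above can be made concrete with witness constants. A small sketch (the function T and the constants 13 and 3 are chosen by hand for this example): sandwiching T(n) = 3n^2 + 10n between two constant multiples of n^2 shows T(n) = Θ(n^2).

```python
# Sketch: T(n) = 3n^2 + 10n is Theta(n^2). Hand-picked witness constants:
#   upper bound: T(n) <= 13 * n^2 for all n >= 1  (since 10n <= 10n^2 when n >= 1)
#   lower bound: T(n) >=  3 * n^2 for all n >= 1
def T(n):
    return 3 * n**2 + 10 * n

assert all(T(n) <= 13 * n**2 for n in range(1, 1000))  # O(n^2)
assert all(T(n) >=  3 * n**2 for n in range(1, 1000))  # Omega(n^2)
print("3*n^2 <= T(n) <= 13*n^2, so T(n) = Theta(n^2)")
```

A finite check like this is not a proof, but the algebraic argument in the comments (10n <= 10n^2 for n >= 1) is, and the code shows what the witness constants are doing.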

Properties of Asymptotic Growth Rates

  • Transitivity: if a function f is upper bounded by a function g, and g is itself upper bounded by a function h, then f is upper bounded by h (i.e., f = O(g) and g = O(h) implies f = O(h)).
  • Sums of Functions: if the same upper bound applies to both f and g individually, then it applies to the sum f + g (i.e., f = O(h) and g = O(h) implies f + g = O(h)).
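The sum property can be seen directly: the reasoning is that if f <= c1 * h and g <= c2 * h, then f + g <= (c1 + c2) * h. A minimal sketch with made-up example functions and hand-picked constants:

```python
# Sketch of the sum property: f = O(h) with constant c1, g = O(h) with c2,
# so f + g <= (c1 + c2) * h, i.e., f + g = O(h).
f = lambda n: 5 * n**2          # O(n^2) with c1 = 5
g = lambda n: n**2 + 3 * n      # O(n^2) with c2 = 4 (since 3n <= 3n^2 for n >= 1)
h = lambda n: n**2

c1, c2 = 5, 4
for n in range(1, 500):
    assert f(n) <= c1 * h(n)
    assert g(n) <= c2 * h(n)
    assert f(n) + g(n) <= (c1 + c2) * h(n)   # the sum inherits the bound
```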

Asymptotic Bounds for Some Common Functions

  • Polynomials
    • If T(n) is a polynomial of degree d, then T(n) is O(n^d).
    • A polynomial-time algorithm is one whose running time T(n) is O(n^d) for some constant d, where d is independent of the input size.
  • Logarithms
    • Logarithms are slow-growing functions. The base of the logarithm only changes O(log n) by a constant factor, so it is usually omitted.
  • Exponentials
    • Exponentials have fast rates of growth.
    • For every r > 1 and every d > 0, we have n^d = O(r^n).
    • Don't just say a function is "exponential" without giving the base (the way people loosely drop the base of a log): unlike logarithms, different exponential bases differ by more than a constant factor.
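The three growth rates above can be compared at a few concrete sizes. A small sketch (the sample sizes and the exponent 3 are arbitrary choices for illustration):

```python
import math

# Sketch: log n crawls, n^3 grows polynomially, 2^n dwarfs both.
for n in (10, 100, 1000):
    digits = len(str(2**n))  # number of decimal digits in 2^n
    print(f"n={n:5d}  log2(n)={math.log2(n):7.2f}  n^3={n**3:12d}  2^n has {digits} digits")
```

By n = 1000, log2(n) is still under 10, n^3 has ten digits, and 2^n has over 300 digits, which is the content of the claim n^d = O(r^n) for every r > 1 and d > 0.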