Chapter 2
2.1-2.2
Discrete problems: those involving an implicit search over a large set of combinatorial possibilities
How do we measure runtime?: using the worst-case runtime
- Why not avg runtime? –> it is unclear what distribution of inputs the "average" should be taken over
- When looking at polynomial runtimes, we only really need to pay attention to the highest-order term
An algorithm is efficient if it has a polynomial running time
Desirable Scaling Property: when the input size doubles, the algorithm should only slow down by some constant factor C (from ppt)
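The scaling property can be illustrated with a small model (a sketch, not a measurement): if an algorithm's runtime is roughly T(n) = c·n^d for constants c and d, then doubling the input size multiplies the runtime by the constant factor 2^d.

```python
# Sketch: model a polynomial runtime T(n) = c * n^d and check that
# doubling n slows it down by only a constant factor (2^d).
# c and d here are illustrative constants, not measured values.

def T(n, c=1.0, d=2):
    """Model runtime c * n^d; d=2 models a quadratic algorithm."""
    return c * n ** d

for n in [100, 200, 400, 800]:
    # For d=2 the ratio is always 2^2 = 4, independent of n.
    print(f"n={n:3d}: T(2n)/T(n) = {T(2 * n) / T(n):.1f}")
```

An exponential runtime like 2^n fails this property: doubling n squares the runtime rather than multiplying it by a constant.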
Runtimes in order from fastest to slowest (slowest-growing to fastest-growing):
- n
- n log2 n (the minimum runtime for comparison-based sorting algorithms)
- n^2
- n^3
- 1.5^n
- 2^n
- n!
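Evaluating each function from the list above at a couple of input sizes (an illustrative sketch) shows how quickly the orders of growth separate:

```python
import math

# Evaluate each growth function from the list at n = 10 and n = 30.
growth = [
    ("n",        lambda n: n),
    ("n log2 n", lambda n: n * math.log2(n)),
    ("n^2",      lambda n: n ** 2),
    ("n^3",      lambda n: n ** 3),
    ("1.5^n",    lambda n: 1.5 ** n),
    ("2^n",      lambda n: 2 ** n),
    ("n!",       lambda n: math.factorial(n)),
]

for name, f in growth:
    print(f"{name:>8}: f(10) = {f(10):,.0f}, f(30) = {f(30):.3e}")
```

Already at n = 30 the exponential and factorial functions dwarf every polynomial, which is why polynomial runtimes are taken as the definition of "efficient."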
Asymptotic Defs
Asymptotic Upper Bounds: We say T(n) = O(f(n)) (“T(n) is order f(n)”) if, for all sufficiently large n, the function T(n) is bounded above by a constant multiple of f(n): there are constants c > 0 and n0 ≥ 0 such that T(n) ≤ c·f(n) for all n ≥ n0. Basically, if some constant multiple of f(n) stays greater than T(n) forever after a certain point, “T is asymptotically upper-bounded by f”
Asymptotic Lower Bounds: we say T(n) is Ω(f(n)) (“T is asymptotically lower-bounded by f”) in the case opposite of the asymptotic upper bound: there is a constant ε > 0 such that T(n) ≥ ε·f(n) forever after a certain point.
Asymptotically Tight Bounds: if T(n) is both O(f(n)) and Ω(f(n)), we say that T(n) is Θ(f(n))
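A quick numerical sanity check of a tight bound (a sketch, with illustrative constants): T(n) = 3n² + 5n + 2 is Θ(n²), because T(n)/n² approaches the constant 3, so T(n) is eventually sandwiched between constant multiples of n², e.g. 2n² and 4n².

```python
# T(n) = 3n^2 + 5n + 2: the highest-order term dominates, so T(n)/n^2
# settles near 3, witnessing both O(n^2) and Omega(n^2), i.e. Theta(n^2).

def T(n):
    return 3 * n ** 2 + 5 * n + 2

for n in [10, 100, 1000, 10 ** 6]:
    print(f"n={n}: T(n)/n^2 = {T(n) / n ** 2:.4f}")

# For n >= 10, T(n) lies between 2*n^2 and 4*n^2 (constants chosen to work).
assert all(2 * n ** 2 <= T(n) <= 4 * n ** 2 for n in range(10, 1000))
```

This also illustrates the earlier point that only the highest-order term matters: the lower-order terms 5n + 2 vanish in the ratio.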
Review: the Gale-Shapley stable matching algorithm has at most n^2 iterations of its while loop, and each iteration can be implemented in O(1) time with suitable list/array data structures –> worst-case runtime = O(n^2)
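For reference, a minimal Python sketch of Gale-Shapley (the input format — preference lists of 0-indexed ids — and all names here are assumptions for illustration). The free-proposer stack, next-proposal counters, and precomputed rank table make each while-loop iteration O(1), so the at-most-n² iteration bound gives the O(n²) worst-case runtime.

```python
def gale_shapley(proposers, receivers):
    """Return match[i] = receiver matched to proposer i.

    proposers[i] and receivers[j] are preference lists of ids 0..n-1,
    most-preferred first. (Input format assumed for this sketch.)
    """
    n = len(proposers)

    # rank[j][i]: position of proposer i in receiver j's list (lower = better).
    # Precomputing this makes each "does j prefer i?" check O(1).
    rank = [[0] * n for _ in range(n)]
    for j in range(n):
        for pos, i in enumerate(receivers[j]):
            rank[j][i] = pos

    next_choice = [0] * n      # next receiver each proposer will propose to
    current = [None] * n       # current[j] = proposer engaged to receiver j
    free = list(range(n))      # stack of currently free proposers

    while free:                # at most n^2 iterations in total
        i = free.pop()
        j = proposers[i][next_choice[i]]
        next_choice[i] += 1
        if current[j] is None:
            current[j] = i                 # j was free: accept
        elif rank[j][i] < rank[j][current[j]]:
            free.append(current[j])        # j trades up: old partner freed
            current[j] = i
        else:
            free.append(i)                 # j rejects i

    match = [None] * n
    for j, i in enumerate(current):
        match[i] = j
    return match

# Tiny usage example with hypothetical preference lists:
proposers = [[0, 1, 2], [1, 0, 2], [0, 1, 2]]
receivers = [[1, 0, 2], [0, 1, 2], [2, 1, 0]]
print(gale_shapley(proposers, receivers))  # prints [0, 1, 2]
```

Each proposer proposes to each receiver at most once, which is where the n² iteration bound comes from.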
