====== Chapter 2 ======

===== 2.1-2.2 =====

__Discrete__: involving an implicit search over a large set of combinatorial possibilities

How do we measure runtime? Using the __worst-case runtime__
  * Why not average runtime? --> what is "average"? There is no natural distribution over inputs.
  * When looking at polynomial runtimes, we only really need to pay attention to the highest-order term

**An algorithm is __efficient__ if it has a polynomial running time**

__Desirable Scaling Property__: when the input size doubles, the algorithm should only slow down by some constant factor C (from ppt)

Runtimes in order from fastest to slowest (i.e., from slowest-growing to fastest-growing):
  * n
  * nlog2n (the minimum runtime for comparison-based sorting algorithms)
  * n^2
  * n^3
  * 1.5^n
  * 2^n
  * n!

**__Asymptotic Defs__**

__Asymptotic Upper Bounds__: We say T(n) = O(f(n)) ("T(n) is order f(n)") if, for all sufficiently large n, the function T(n) is bounded above by a constant multiple of f(n). Basically, if some constant multiple of f(n) stays at or above T(n) forever after a certain point, "T is asymptotically upper-bounded by f".

__Asymptotic Lower Bounds__: We say T(n) is Ω(f(n)) ("T is asymptotically lower-bounded by f") in the case opposite the asymptotic upper bound: some constant multiple of f(n) stays at or below T(n) forever after a certain point.

__Asymptotically Tight Bounds__: if T(n) is both O(f(n)) and Ω(f(n)), we say that T(n) is Θ(f(n))

===== 2.3: Using Lists and Arrays for Stable Matching =====

Review: the Gale-Shapley stable matching algorithm makes at most n^2 iterations, and with array-based data structures each iteration takes O(1) time --> worst-case runtime = O(n^2)
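The O(n^2) bound above can be made concrete with a short sketch of an array-based Gale-Shapley implementation. This is only an illustrative version, not the book's exact pseudocode; names like ''men_prefs'' and ''rank'' are my own. The key data-structure idea from 2.3 is the precomputed ''rank'' array, which makes each "does w prefer m to her current partner?" check O(1), so the n^2 proposals cost O(n^2) total.

```python
from collections import deque

def gale_shapley(men_prefs, women_prefs):
    """Stable matching in O(n^2) time.

    men_prefs[m] is man m's preference list, most preferred first;
    women_prefs[w] likewise. Everyone is numbered 0..n-1.
    Returns wife, where wife[m] is the woman matched to man m.
    """
    n = len(men_prefs)
    # rank[w][m] = position of man m in woman w's list,
    # so comparing two suitors for w is a constant-time lookup.
    rank = [[0] * n for _ in range(n)]
    for w in range(n):
        for pos, m in enumerate(women_prefs[w]):
            rank[w][m] = pos
    next_choice = [0] * n   # next_choice[m]: index of the next woman m proposes to
    wife = [None] * n
    husband = [None] * n
    free_men = deque(range(n))
    while free_men:          # at most n^2 proposals in total
        m = free_men[0]
        w = men_prefs[m][next_choice[m]]
        next_choice[m] += 1
        if husband[w] is None:
            # w is free: she and m get engaged
            husband[w], wife[m] = m, w
            free_men.popleft()
        elif rank[w][m] < rank[w][husband[w]]:
            # w prefers m to her current partner, who becomes free again
            free_men.popleft()
            free_men.append(husband[w])
            wife[husband[w]] = None
            husband[w], wife[m] = m, w
        # otherwise w rejects m; m stays free and will try his next choice
    return wife
```

Each loop iteration advances some man's ''next_choice'' pointer, and no man proposes to the same woman twice, so the loop runs at most n^2 times, matching the bound quoted above.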