===== Week 1 =====

==== Chapter 0: Preface ====

Algorithms are everywhere. By using algorithms, we can not only find solutions to difficult problems but also do so efficiently and prove that our solutions are correct. If we work hard, we can find solutions to problems as //difficult// as the Stable Matching Problem, as //culturally welcome// as the same-sex Stable Matching Problem((The footnote on page 3 of the textbook mentions that Gale and Shapley also considered a same-sex case of the Stable Marriage Problem that is very different from the heterosexual Stable Marriage Problem. [[http://philosophyforprogrammers.blogspot.com/2011/03/gay-marriage-computer-science.html|This article]] from the "Philosophy for Programmers" blog is a good read because it addresses the merit of such an algorithm while providing an interesting perspective on the subject.)), and as //easy// as the Traveling Salesman Problem((Note that the //italicized// words are sarcastically opposite of what they actually are. :-) )). Algorithms help us take a problem, break it down to its essential components, solve it, prove the solution correct, and then apply that algorithm to real-life problems.

==== Chapter 1: Introduction: Some Representative Problems ====

=== The Stable Matching Problem ===

Gale and Shapley developed an algorithm to solve the Stable Matching Problem, typically referred to as the **Stable Marriage Problem**. The problem states that, given ''n'' men and ''n'' women, each with a preference list ranking all members of the other group, we must pair up all of the men and women such that no unstable pairs occur. An //instability// is any occasion when a man ''m1'' is paired with a woman ''w1'' and a man ''m2'' is paired with a woman ''w2'', but ''m1'' and ''w2'' would each prefer to be married to the other over their current partners.
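The instability condition above can be checked directly by scanning every man's preference list. A minimal sketch, assuming a dictionary-based representation of the preference lists; the names ''prefers'' and ''find_instability'' are illustrative, not the textbook's notation:

```python
def prefers(pref_list, a, b):
    """True if this person's preference list ranks a ahead of b."""
    return pref_list.index(a) < pref_list.index(b)

def find_instability(matching, men_prefs, women_prefs):
    """matching: dict mapping each man to his assigned woman.
    Returns an unstable (man, woman) pair who are not matched to
    each other but prefer each other, or None if the matching is stable."""
    husband = {w: m for m, w in matching.items()}
    for m, w_current in matching.items():
        for w in men_prefs[m]:
            if w == w_current:
                break  # every woman after this point is less preferred than his partner
            # m prefers w to his own partner; is the feeling mutual?
            if prefers(women_prefs[w], m, husband[w]):
                return (m, w)
    return None

# Small example: both men rank w1 first, both women rank m2 first.
men_prefs = {"m1": ["w1", "w2"], "m2": ["w1", "w2"]}
women_prefs = {"w1": ["m2", "m1"], "w2": ["m2", "m1"]}
# Pairing m1-w1 and m2-w2 is unstable: m2 and w1 prefer each other.
print(find_instability({"m1": "w1", "m2": "w2"}, men_prefs, women_prefs))  # ('m2', 'w1')
```

Swapping the pairing to ''m1''-''w2'' and ''m2''-''w1'' makes the check return ''None'', since no man and woman then mutually prefer each other over their partners.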
The algorithm consists of having each free man propose to the first woman on his preference list that he hasn't proposed to yet. If the woman is not engaged, the two become engaged. If the woman is engaged but prefers the new man to her current partner (based on her preference list), she will dramatically break off the engagement and run off to elope with the new dream man (at least, until a new Prince Charming comes along). If she prefers her current partner, she rejects the proposal and the man moves on to the next woman on his list. The process repeats until no man is free, and the resulting matching is guaranteed to be stable.

=== Five Representative Problems ===

As we continue to learn about algorithms, we will come across five main problem themes:

  - Interval Scheduling Problem
  - Weighted Interval Scheduling Problem
  - Bipartite Matching Problem
  - Independent Set Problem
  - Competitive Facility Location Problem

The **Interval Scheduling Problem** is about choosing which of potentially many scheduling requests to confirm for a given resource over time such that as many requests as possible are fulfilled. The **Weighted Interval Scheduling Problem** is just like the //Interval Scheduling Problem// except that each request has a "weight" or priority attributed to it, and the goal is to find a compatible set of requests with the greatest total weight. The **Bipartite Matching Problem** is a type of problem where a relationship needs to be established between some or all of the elements in two distinct sets, as in the //Stable Marriage Problem//. The **Independent Set Problem** is: given a //graph// of vertices and their respective edges, find the largest subset of vertices in which no two vertices are directly connected by an edge. The **Competitive Facility Location Problem** is a problem where two competing players take turns claiming weighted nodes on a graph, with the restriction that no claimed node may be adjacent to any node already claimed, each player trying to maximize the total weight it claims.

==== Chapter 2: Basics of Algorithm Analysis ====

=== Computational Tractability ===

It is difficult to define efficiency, much less determine whether a given algorithm is efficient or not.
It may be tempting to just compare an algorithm's running time with that of the equivalent brute-force algorithm, but how bad is the brute-force algorithm allowed to be, and what kind of scale do you use when you say an algorithm is "this much better than brute force"? The worst-case running times for an algorithm are useful, but those times still depend on the type and size of the problem. To determine if an algorithm is efficient, we must compare it against a consistent baseline. That baseline is polynomial running time: if the size of the problem doubles, the running time of a polynomial-time algorithm grows by only a constant multiplicative factor (for example, doubling ''n'' at most quadruples the running time of an ''O(n^2)'' algorithm).

=== Asymptotic Order of Growth ===

If we are analyzing the time behavior of an algorithm, it would be nice if we could provide some useful general characteristics about it that could be used to relate it to other algorithms with similar time behavior. A good way to do this is to put bounds on the algorithm's time complexity: an upper bound and a lower bound. The upper bound of an algorithm's time complexity is notated ''O(f(n))'' and is known as **"Big-O" notation**. Saying that a running time ''T(n)'' is ''O(f(n))'' means that, beyond some problem size, ''T(n)'' never exceeds a constant multiple of ''f(n)'' as the size of the problem grows to infinity. Similarly, the lower bound of an algorithm's time complexity is notated ''Ω(f(n))'' and is known as **"Big-Omega" notation**: beyond some problem size, ''T(n)'' is always at least a constant multiple of ''f(n)''. "Big-Omega" notation is useful because it puts a floor under how quickly the algorithm can run on large inputs; its growth cannot be better than ''f(n)''.

~~DISCUSSION~~
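The "doubling" property of polynomial time from the Computational Tractability section can be seen directly by counting basic steps. A small illustrative sketch (the step-counting function is an example of my own, not from the textbook): for a quadratic brute-force scan, doubling the input size multiplies the work by exactly four.

```python
def count_pair_operations(n):
    """Count the basic steps of a brute-force all-pairs scan,
    a typical Theta(n^2) algorithm."""
    steps = 0
    for i in range(n):
        for j in range(n):
            steps += 1
    return steps

# Doubling n multiplies the step count by a constant factor (4 here),
# which is the hallmark of polynomial running time.
for n in (100, 200, 400):
    print(n, count_pair_operations(2 * n) / count_pair_operations(n))  # ratio is 4.0 each time
```

By contrast, an exponential-time algorithm fails this test: doubling the input size of a ''2^n''-step search squares the step count rather than multiplying it by a constant.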