===== Section 1: Computational Tractability =====
Again, this section was very readable. I thought that they did a great job of explaining why we need the mathematical definition of efficiency and how we get to it. And I read this after our classes on the topic, and I think that was good for me, because I already knew what to expect.
  * **Sara Says:** "Wait, I thought our algorithm for the stable matching problem took O(N^2) time?" --> Yes, that's correct. The brute-force approach would be: "Look at all of the possible pairings (N!), find one that is stable and that's the solution." Gale-Shapley is a more efficient algorithm.
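To put actual numbers on that gap, here's a quick Python sketch comparing N^2 (roughly the number of proposals G-S makes) with N! (the number of pairings a brute-force search would have to consider); the sample values of N are arbitrary:

<code python>
from math import factorial

# N^2 (roughly the work G-S does) vs. N! (pairings brute force checks)
for n in [5, 10, 20]:
    print(n, n**2, factorial(n))
# 5 25 120
# 10 100 3628800
# 20 400 2432902008176640000
</code>
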
===== Section 2: Asymptotic Order of Growth =====
This section was pretty readable, but I just skimmed over the proofs (I might revisit them later), because I felt that I had a good understanding of them from what we've discussed in class. Obviously, that kind of math is harder to read in general, but I think they do a good job of explaining it.
===== Section 3: Implementing the Stable Matching Algorithm Using Lists and Arrays =====
  * **Summary:**
You don't have to actually program to be able to analyze the running time of algorithms, but you do need to be able to think about what data structures you would/could use (and how they behave). Some data structures will be better than others for different algorithms. Data structure choice is up to the designer! Sometimes it's important to consider the preprocessing of the input. Arrays vs. Lists --> The book goes into the running times of various list/array operations. That information (in a more compact form) is also in our class notes. Important note: it's easy to convert between lists and arrays (O(n) time), so they can be used together! For the G-S algorithm for the stable matching problem, we said earlier that it should complete in O(n^2) time, BUT in order for this to work, every step has to complete in constant time. The book goes into how to do this (i.e. what data structures to use to represent what and why). We did that pretty extensively in class, so see notes or p. 45-47 for that info.
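To make that concrete, here is a minimal Python sketch of an implementation along the lines the book describes; the names next_proposal, current, and ranking are my stand-ins for the arrays the book uses:

<code python>
def gale_shapley(man_pref, woman_pref):
    """man_pref[m] / woman_pref[w] are preference lists, most preferred first."""
    n = len(man_pref)
    # ranking[w][m] = position of man m on woman w's list, so a woman
    # can compare two proposers in constant time.
    ranking = [[0] * n for _ in range(n)]
    for w in range(n):
        for rank, m in enumerate(woman_pref[w]):
            ranking[w][m] = rank
    next_proposal = [0] * n    # next woman on m's list to propose to
    current = [None] * n       # current[w] = w's partner, or None
    free_men = list(range(n))  # append/pop at the end are O(1)
    while free_men:
        m = free_men.pop()
        w = man_pref[m][next_proposal[m]]
        next_proposal[m] += 1
        if current[w] is None:                   # w is free: engage
            current[w] = m
        elif ranking[w][m] < ranking[w][current[w]]:
            free_men.append(current[w])          # w trades up; old partner freed
            current[w] = m
        else:
            free_men.append(m)                   # w rejects m; he stays free
    return current                               # current[w] = man matched to w
</code>

Every pass through the while loop is one proposal and does only constant work, which is exactly what gives the O(n^2) bound.
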
  * **Insights and Questions:**
Linked lists to represent the men's preference lists/... Is it always possible to make every "step" of an algorithm run in constant time?
  * **Readability:**
Awesome! It made a great review of what we did in class!
===== Section 4: A Survey of Common Running Times =====
  * **Summary:**
Particular styles of approaches to a problem correspond/lead to particular running times, so it's useful to survey the common ones.

**Linear (O(n)) Time:** Comes from "one-pass" style algorithms: you process the input in a single pass, spending a constant amount of time on each item. Ex. computing the maximum of n numbers, or merging two sorted lists.
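The merging example, as a quick Python sketch: each pass through the loop does constant work and consumes one input element, so the whole thing is O(n).

<code python>
def merge(a, b):
    """Merge two sorted lists into one sorted list in O(n) time."""
    result = []
    i = j = 0
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:                 # take the smaller front element
            result.append(a[i]); i += 1
        else:
            result.append(b[j]); j += 1
    result.extend(a[i:])                 # at most one of these is non-empty
    result.extend(b[j:])
    return result
</code>
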
**O(nlogn) Time:** You get O(nlogn) time for algorithms that split the input into two equal-sized pieces, solve each piece recursively, and then combine the two solutions in linear time. Ex. Mergesort. It also shows up a lot because many algorithms' most expensive step is sorting the input.
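Mergesort, reusing the merge() sketch above, shows the split/recurse/combine pattern directly:

<code python>
def mergesort(xs):
    """Sort by splitting in half, recursing, and merging in linear time."""
    if len(xs) <= 1:
        return xs
    mid = len(xs) // 2
    # two half-sized recursive calls plus an O(n) merge => O(nlogn) overall
    return merge(mergesort(xs[:mid]), mergesort(xs[mid:]))
</code>
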
**Quadratic Time:** Quadratic time comes from a pair of nested loops. The outside loop runs O(n) times, each time starting another loop that runs in O(n) time. The total running time is O(n^2). Ex. finding the closest pair of points in a plane using a brute-force algorithm. You need to find all of the pairs of points (O(n^2) time) and compute the distance between them (constant time). That's not the best way to do that problem, though. Later the book will explain how to do it in O(nlogn) and even O(n) time.
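The brute-force version is just the two nested loops; a sketch:

<code python>
from math import dist   # Euclidean distance, Python 3.8+

def closest_pair(points):
    """Brute-force closest pair: O(n^2) pairs, constant work per pair."""
    best = None
    for i in range(len(points)):               # outer loop: O(n)
        for j in range(i + 1, len(points)):    # inner loop: O(n)
            d = dist(points[i], points[j])
            if best is None or d < best[0]:
                best = (d, points[i], points[j])
    return best
</code>
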
**Cubic Time:** More elaborate sets of nested loops can lead to O(n^3) time. Ex. given a bunch of subsets of a set, figuring out if any pair of those subsets is disjoint takes O(n^3). The algorithm for that is on p. 52-53.
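A sketch along the lines of the p. 52-53 algorithm; I'm using Python sets so membership tests are constant time, standing in for the book's array representation:

<code python>
def find_disjoint_pair(subsets):
    """O(n^2) pairs of sets, O(n) work to compare each pair => O(n^3)."""
    sets = [set(s) for s in subsets]     # O(1)-time membership tests
    for i in range(len(sets)):
        for j in range(i + 1, len(sets)):
            if all(x not in sets[j] for x in sets[i]):
                return (i, j)            # found a disjoint pair
    return None
</code>
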
**O(n^k) Time:** It takes O(n^k) time to find k-element subsets of a set of n items (just like it takes O(n^2) time to find pairs). The book goes into an example of finding an independent set of size k in a graph. Finding all of the k-element subsets takes O(n^k) time, and processing each of those subsets (i.e. checking each pair of nodes within the subset to see if there's an edge between them) takes O(k^2) time, so the whole thing takes O(k^2 * n^k) time, which is still O(n^k) when k is a constant.
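As a sketch (representing each edge as a frozenset of its two endpoints; that representation is my choice, not the book's):

<code python>
from itertools import combinations

def independent_set_of_size(nodes, edges, k):
    """Try all O(n^k) k-element subsets; check O(k^2) pairs in each."""
    for subset in combinations(nodes, k):
        if all(frozenset(pair) not in edges
               for pair in combinations(subset, 2)):
            return subset                # no internal edges: independent
    return None

# e.g. on the path 1-2-3-4:
edges = {frozenset(e) for e in [(1, 2), (2, 3), (3, 4)]}
print(independent_set_of_size([1, 2, 3, 4], edges, 2))   # (1, 3)
</code>
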
**Beyond Polynomial Time:** Common running times that increase faster than polynomial are 2^n and n!. Ex. finding an independent set of maximum size in a graph - you not only have the O(n^k) time, you have to look at subsets of ALL possible lengths, so that ends up taking O(n^2 * 2^n) time! Even worse are n! running times. n! search spaces arise in 2 ways: (1) It's the number of ways to match n items with n other items (ex. the possible perfect matchings in our stable matching problem) and (2) It's the number of ways to arrange n items in order (ex. the traveling salesman problem - what order should he visit the cities in? --> believed to have no efficient solution).
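The brute-force maximum independent set, as a sketch building on the previous one (searching from the largest subsets down, so the first hit is a maximum):

<code python>
from itertools import combinations

def max_independent_set(nodes, edges):
    """Check all 2^n subsets, O(n^2) work each => O(n^2 * 2^n)."""
    for k in range(len(nodes), 0, -1):       # largest subsets first
        for subset in combinations(nodes, k):
            if all(frozenset(pair) not in edges
                   for pair in combinations(subset, 2)):
                return subset                # first hit is maximum-size
    return ()
</code>
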
**Sublinear Time:** Some problems take less time than linear. You don't have to look through all of the input, you just need to "probe" it in the right places. Ex. binary search in a sorted array takes only O(logn) time, because each probe lets you throw away half of what's left.
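Binary search as a sketch: each probe discards half of the remaining array, so only O(logn) entries are ever examined.

<code python>
def binary_search(xs, target):
    """Return an index of target in sorted list xs, or -1 if absent."""
    lo, hi = 0, len(xs) - 1
    while lo <= hi:
        mid = (lo + hi) // 2         # probe the middle element
        if xs[mid] == target:
            return mid
        elif xs[mid] < target:
            lo = mid + 1             # target can only be in the top half
        else:
            hi = mid - 1             # ...or the bottom half
    return -1
</code>
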
  * **Insights and Questions:**
I am really interested to see how some of these problems are solved in less time. Also, I am intrigued by the problems that can't be solved efficiently, like the traveling salesman problem.
  * **Readability:**
I thought this section was great. The breakups into different asymptotic running times and the connections between the running times (ex. that the O(n^2) time to look at all pairs generalizes to O(n^k) time for k-element subsets) made it really easy to follow.
===== Section 5: Priority Queues =====
  * **Summary:**
Good algorithms are important to get efficient solutions, but sometimes you need more complex data structures in order to implement the algorithms efficiently (or make improvements to the algorithm). Some data structures are designed for specific problems, but some are more broadly applicable - like the priority queue, which is a data structure that "must perform some nontrivial processing each time it is invoked."
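A minimal sketch of the heap-based priority queue this section builds, with heapify-up and heapify-down; I'm using a 0-indexed array where the book's version is 1-indexed:

<code python>
class PriorityQueue:
    def __init__(self):
        self.h = []                        # binary heap; h[0] is the min

    def insert(self, key):                 # O(logn)
        self.h.append(key)
        self._heapify_up(len(self.h) - 1)

    def extract_min(self):                 # O(logn); assumes non-empty
        self.h[0], self.h[-1] = self.h[-1], self.h[0]
        smallest = self.h.pop()
        if self.h:
            self._heapify_down(0)
        return smallest

    def _heapify_up(self, i):
        while i > 0:
            parent = (i - 1) // 2
            if self.h[i] < self.h[parent]:      # too small for its spot:
                self.h[i], self.h[parent] = self.h[parent], self.h[i]
                i = parent                      # keep swapping upward
            else:
                break

    def _heapify_down(self, i):
        n = len(self.h)
        while True:
            l, r = 2 * i + 1, 2 * i + 2         # children of position i
            smallest = i
            if l < n and self.h[l] < self.h[smallest]:
                smallest = l
            if r < n and self.h[r] < self.h[smallest]:
                smallest = r
            if smallest == i:                   # heap order restored
                break
            self.h[i], self.h[smallest] = self.h[smallest], self.h[i]
            i = smallest
</code>

Both operations walk a single root-to-leaf path, so each is O(logn). (Python's built-in heapq module provides the same operations on a plain list.)
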
  * **Insights and Questions:**
Interesting distinction between arrays/lists, which just store elements, and data structures like priority queues that have to do nontrivial work on every operation.
  * **Readability:**
This section was a little harder to read than others. It flowed well, but the logic of jumping from arrays to priority queues to heaps wasn't quite as clear or as seamless as in other sections, or as clear as it was in class. I think that actually seeing someone do heapify-up/heapify-down on an example makes it a lot easier to understand.