  * No questions.
  * Well done; I think that the level of proof/
+ | |||
+ | ===== 2.3 Implementing the Stable Matching Algorithm ===== | ||
+ | |||
+ | * Computational complexity of an algorithm depends on the data structures; the analytical complexity can only be completed if every step is constant complexity (and meeting this goal may require preprocessing, | ||
+ | * No insights. | ||
+ | * I am still somewhat bothered by the suggestion of O(1) alteration of lists; it should be O(i), like the lookup, even for a mutable list. That said, for Stable Matching we are only using lists as queues. | ||
+ | * I like the combination of the introduction of the data structures with putting them to use, but doing all the description of the structures first somewhat ruins the effect. Well-written anyway. 8/10. | ||
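
A minimal sketch, in Python with hypothetical names, of how the constant-time steps can be arranged: each preference list is consumed purely as a queue (so only O(1) removal from the front is ever needed), and a preprocessed rank table makes each woman's comparisons O(1).
<code python>
from collections import deque

def stable_matching(men_prefs, women_prefs):
    # Preprocessing: rank[w][m] = w's ranking of m, so comparing two
    # suitors is O(1) instead of an O(n) scan of w's list.
    rank = {w: {m: i for i, m in enumerate(prefs)}
            for w, prefs in women_prefs.items()}
    # Each man's preference list is used only as a queue: popleft() is O(1),
    # which is all the algorithm needs from the list structure.
    next_choice = {m: deque(prefs) for m, prefs in men_prefs.items()}
    free_men = deque(men_prefs)       # queue of currently unengaged men
    engaged_to = {}                   # woman -> her current partner

    while free_men:
        m = free_men.popleft()
        w = next_choice[m].popleft()  # highest-ranked woman m has not tried
        if w not in engaged_to:
            engaged_to[w] = m
        elif rank[w][m] < rank[w][engaged_to[w]]:
            free_men.append(engaged_to[w])  # jilted partner is free again
            engaged_to[w] = m
        else:
            free_men.append(m)        # w rejects m; he proposes again later
    return engaged_to
</code>
Every step inside the loop is O(1), and each man proposes to each woman at most once, so the whole run is O(n^2).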
+ | |||
+ | ===== A Survey of Common Running Times ===== | ||
+ | |||
+ | * Knowing common runtimes is important, both for knowing what to expect to be able to pare an algorithm to and knowing the complexity of various operations within an operation. All operation requiring some processing of each part of the input is at least linear; things that can be done in a single pass (of constant-time steps) are tightly linear. Divide-and-recurse algorithms often come to O(n log n), so it is common for sorts. Polynomial times with higher exponents often arise from looking at sets of sets, or all subsets of a set of n elements. Exponential time arises typically from brute-force algorithms trying all possibilities, | ||
+ | * No Insights. | ||
+ | * No Questions. | ||
+ | * Fairly clear, although going by increasing runtime and then concluding with sublinear was rather jarring (knowing that there are sublinear algorithms, I spent most of the section wondering when I would see them. 7/10. | ||
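
A few minimal sketches (in Python, with hypothetical names) of the running times surveyed, from linear up to exponential and back down to sublinear:
<code python>
from itertools import combinations

def find_max(xs):          # O(n): one constant-time step per input element
    best = xs[0]
    for x in xs[1:]:
        if x > best:
            best = x
    return best

def merge_sort(xs):        # O(n log n): halve, recurse, merge in linear time
    if len(xs) <= 1:
        return xs
    mid = len(xs) // 2
    left, right = merge_sort(xs[:mid]), merge_sort(xs[mid:])
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]

def closest_pair_1d(xs):   # O(n^2): examines every pair of elements
    return min(abs(a - b) for i, a in enumerate(xs) for b in xs[i + 1:])

def subset_sums_to(xs, t): # O(2^n): brute force over all subsets
    return any(sum(c) == t
               for r in range(len(xs) + 1)
               for c in combinations(xs, r))

def binary_search(xs, t):  # O(log n): sublinear; halves the range each step
    lo, hi = 0, len(xs)    # assumes xs is sorted
    while lo < hi:
        mid = (lo + hi) // 2
        if xs[mid] < t:
            lo = mid + 1
        elif xs[mid] > t:
            hi = mid
        else:
            return mid
    return -1
</code>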
+ | |||
+ | ===== Priority Queues ===== | ||
+ | * A common problem is keeping a set of things and retrieving the maximum (or minimum), but without random access. A heap is a structure of nodes, each branching to 0-2 other nodes; it is in heap order if each node has key at least as large as its parent). They can be intuitively implemented as trees, but also as arrays if the size is bounded. Since balanced trees have depth log n, most heap operations have complexity O(log n). | ||
+ | * I am still bothered by the approach to deleting the root by sending the larger child to the bottom, when it is quite likely that it will need to be moved back near the top, although I cannot improve on it. | ||
+ | * What is the running time on an operation that recursively operates on each node of a tree (and then merges results in constant time)? It seems that it should be O(n), since there are n nodes and it must hit each once, but it also bears some similarity to the O(n log n) recursive sorts (such as mergesort). | ||
+ | * Well-written. 8/10. | ||
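
A minimal sketch, in Python with hypothetical names, of the array-based heap (min-heap version): deleting the root patches the hole with the last leaf and lets heapify-down sink it back toward the bottom.
<code python>
class MinHeap:
    # Array-backed heap: node i's children live at 2*i + 1 and 2*i + 2,
    # so a heap of n keys forms a balanced tree of depth ~log n, and both
    # operations below cost O(log n).
    def __init__(self):
        self.a = []

    def insert(self, key):
        self.a.append(key)
        i = len(self.a) - 1
        while i > 0 and self.a[i] < self.a[(i - 1) // 2]:  # heapify-up
            self.a[i], self.a[(i - 1) // 2] = self.a[(i - 1) // 2], self.a[i]
            i = (i - 1) // 2

    def extract_min(self):
        root, last = self.a[0], self.a.pop()
        if self.a:
            self.a[0] = last   # a bottom leaf is patched into the root...
            i = 0
            while True:        # ...and heapify-down sinks it back down
                small = i
                for c in (2 * i + 1, 2 * i + 2):
                    if c < len(self.a) and self.a[c] < self.a[small]:
                        small = c
                if small == i:
                    break
                self.a[i], self.a[small] = self.a[small], self.a[i]
                i = small
        return root
</code>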