  
Total Implementation Runtime: O(n<sup>2</sup>)

----

==== Personal Thoughts ====

This section was relatively easy to understand, as it was mostly review from CSCI 112. The refresher on lists and arrays was a helpful warm-up before exploring the implementation of the Stable Matching Problem. Between the class lecture and this textbook section, the implementation and its associated running time make sense.
Readability: 10
Interesting: 8
  
===== 2.4 A Survey of Common Running Times =====
  
  * Total number of subsets: 2<sup>n</sup>
  * Outer Loop: iterate over all subsets -> **O(2<sup>n</sup>)**
  * Inner Loop: checking all pairs from a set S -> **O(n<sup>2</sup>)**
  * Total: **O(n<sup>2</sup>2<sup>n</sup>)**
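To make the two factors concrete, here is a minimal brute-force sketch (my own illustration, not from the text; the `max_independent_set` name, the node range 0..n-1, and the edge-list format are all assumptions):

```python
from itertools import combinations

def max_independent_set(n, edges):
    """Brute-force search: the outer loop visits all 2^n subsets of the n
    nodes, and the inner check tests all O(n^2) pairs within a subset."""
    edge_set = {frozenset(e) for e in edges}
    best = set()
    for size in range(n + 1):                        # outer loop: every subset size...
        for subset in combinations(range(n), size):  # ...and every subset of that size
            # inner loop: no pair in the subset may be joined by an edge
            if all(frozenset(p) not in edge_set for p in combinations(subset, 2)):
                if size > len(best):
                    best = set(subset)
    return best

# Path graph 0-1-2-3: a largest independent set is {0, 2}.
print(max_independent_set(4, [(0, 1), (1, 2), (2, 3)]))  # → {0, 2}
```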

  * n!: the number of ways to match up n items with n other items (e.g., the number of possible perfect matchings in the Stable Matching Problem)
  * n!: arises when the search space consists of all ways to arrange n items in order (Traveling Salesman Problem)
  * Problems with n!-sized search spaces can often be solved much faster: the Stable Matching Problem has n! possible matchings, yet Gale-Shapley runs in O(n<sup>2</sup>) time.

**Sublinear Time**

  * Running times that are asymptotically smaller than linear -> the goal is to minimize the amount of querying necessary
  * Binary Search Algorithm: shrink the region where p might lie by a factor of 2 with each probe -> after k probes, the active region has size at most (1/2)<sup>k</sup>n
          * For the active region to shrink to constant size: k = log<sub>2</sub> n probes suffice
          * Running Time: **O(log n)**
  * O(log n) tends to be the time bound when an algorithm does a constant amount of work to throw away a constant fraction of the input
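The probing argument above can be sketched directly (a generic binary search over a sorted list; the function name and the -1 "not found" convention are my own):

```python
def binary_search(arr, target):
    """Each probe halves the active region [lo, hi], so after k probes its
    size is at most n/2^k -> at most about log2(n) probes overall."""
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2        # one probe
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            lo = mid + 1            # discard the lower half
        else:
            hi = mid - 1            # discard the upper half
    return -1                       # target is not present

print(binary_search([2, 5, 8, 12, 16, 23, 38], 16))  # → 4
```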
 + 

----

==== Personal Thoughts ====

This section was useful to read, as I sometimes struggle to identify the situations where different running times apply. The more basic running times make sense, but I struggled a bit to understand the more complex ones (O(n<sup>k</sup>) time, sublinear time, etc.). This section will be good to look back on when we get into later algorithms.
Readability: 7
Interesting: 7
 + 

===== 2.5 A More Complex Data Structure: Priority Queues =====

**The Problem**
  * Priority Queue: elements have a priority value (key), and each time we need to select an element from S, we take the one with the highest priority
          - Maintains a set of elements S
          - Smaller keys represent higher priority
  * Good for processing real-time events, such as the scheduling of processes on a computer
  * Priority Queue Operations: a sequence of O(n) priority queue operations can be used to sort n numbers -> at O(log n) per operation, sorting takes **O(n log n)**
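As a sketch of that bound, sorting can be done with n inserts followed by n extract-mins (here using Python's standard `heapq` module as the priority queue; the `pq_sort` wrapper is my own):

```python
import heapq

def pq_sort(nums):
    """n Insert operations followed by n ExtractMin operations:
    2n operations at O(log n) each -> O(n log n) total."""
    h = []
    for x in nums:
        heapq.heappush(h, x)                          # Insert
    return [heapq.heappop(h) for _ in range(len(h))]  # repeated ExtractMin

print(pq_sort([5, 2, 8, 1]))  # → [1, 2, 5, 8]
```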
 + 

**A Data Structure for Implementing a Priority Queue**

  * List with a pointer to the minimum: adding new elements is easy, but after extracting the minimum we need an O(n) scan to find the new minimum
  * Sorted Array: binary search finds the position of insertion in O(log n), but shifting all later elements makes each insertion O(n)
  * Sorted Doubly Linked List: insertions are O(1) once the position is known, but finding the position of insertion takes O(n)

__The Heap:__ a balanced binary tree: a root plus nodes with up to two children; the key of any element is at least as large as the key of its parent node
 + 

  * For a heap with bound N, we can use an array of length N
  * H[1] is the root
  * leftChild(i) = 2i
  * rightChild(i) = 2i+1
  * parent(i) = ⌊i/2⌋
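Under this 1-indexed convention the index arithmetic is just a few integer operations (a small sketch; the unused H[0] padding slot, the sample keys, and the `parent` helper, which is the inverse of the child maps, are my own):

```python
# 1-indexed heap in a plain array; slot 0 is unused padding so that the
# index formulas apply verbatim.
H = [None, 3, 7, 5, 15, 9, 8]    # H[1] = 3 is the root

def left_child(i):
    return 2 * i

def right_child(i):
    return 2 * i + 1

def parent(i):
    return i // 2                # floor(i/2): inverse of the child maps

print(H[left_child(1)], H[right_child(1)], H[parent(4)])  # → 7 5 7
```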
 + 
 + 

**Implementing the Heap Operations**

  * Identifying the Minimal Element: **O(1)**
  * Adding an Element: add the element in the final position, then perform Heapify-up recursively -> **O(log n)**

{{:courses:cs211:winter2018:journals:patelk:heapify-up.png?nolink&400|}}

  * Deleting an Element: move the element w in position n to the now-vacant position i; the key of w may be either too small or too big -> **O(log n)**
          * If the key is too small: use Heapify-up to reestablish heap order
          * If the key is too big: use Heapify-down to swap the element with one of its children

{{:courses:cs211:winter2018:journals:patelk:heapify-down.png?nolink&400|}}
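A minimal sketch of both routines on the 1-indexed array layout (my own code, not the textbook's pseudocode; min-heap convention with H[0] unused):

```python
def heapify_up(H, i):
    """Bubble H[i] up while it is smaller than its parent.
    Climbs at most one level per step -> O(log n)."""
    while i > 1 and H[i] < H[i // 2]:
        H[i], H[i // 2] = H[i // 2], H[i]
        i //= 2

def heapify_down(H, i):
    """Sink H[i] down while it is larger than one of its children."""
    n = len(H) - 1                         # H[0] is unused padding
    while 2 * i <= n:
        j = 2 * i                          # left child
        if j + 1 <= n and H[j + 1] < H[j]:
            j += 1                         # right child is smaller
        if H[i] <= H[j]:
            break                          # heap order restored
        H[i], H[j] = H[j], H[i]
        i = j

# Insert = place the new key in the final position, then Heapify-up:
H = [None]
for v in [9, 4, 7, 1]:
    H.append(v)
    heapify_up(H, len(H) - 1)
print(H[1])  # → 1 (minimum at the root)
```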
 + 

**Implementing Priority Queues with Heaps**
  * __StartHeap(N):__ returns an empty heap that is set up to store at most N elements -> **O(N)**
  * __Insert(H,v):__ inserts item v into heap H -> **O(log n)**
  * __FindMin(H):__ identifies the minimum element -> **O(1)**
  * __Delete(H,i):__ deletes the element in position i -> **O(log n)**
  * __ExtractMin(H):__ identifies and deletes the element with the minimum key -> **O(log n)**
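Python's standard `heapq` module provides most of these operations directly, under different names (note it is 0-indexed with no H[0] padding, and it offers no positional Delete(H,i); the sample keys are my own):

```python
import heapq

pq = []                        # StartHeap: an empty list is a valid empty heap
for key in [5, 1, 4, 2]:
    heapq.heappush(pq, key)    # Insert(H, v): O(log n) per push
print(pq[0])                   # FindMin(H): the minimum sits at index 0 → 1
print(heapq.heappop(pq))       # ExtractMin(H): remove and return the min → 1
print(heapq.heappop(pq))       # the next-smallest key → 2
```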
 + 

----

==== Personal Thoughts ====

This section was pretty straightforward and easy to follow. Going over the concepts in class before reading this section of the textbook helped clarify things that might otherwise have been confusing. I appreciated the summary of the operation run times, as these can sometimes be difficult to recall.
Readability: 9
Interesting: 7
  
  
courses/cs211/winter2018/journals/patelk/chapter2.1516385242.txt.gz · Last modified: by patelk
CC Attribution-Noncommercial-Share Alike 4.0 International