There are cases where one encounters running times that are asymptotically smaller than linear. Since it takes linear time just to read the input, these situations tend to arise in a model of computation where the input can be "queried" indirectly rather than read completely, as in binary search on a sorted array, which takes //O//(log //n//) time.
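As a concrete illustration (my own sketch, not from the text), binary search only ever probes //O//(log //n//) positions of the sorted input:

  # Binary search over a sorted list: each probe halves the range that
  # could still contain the target, so only O(log n) entries are read.
  def binary_search(arr, target):
      lo, hi = 0, len(arr) - 1
      while lo <= hi:
          mid = (lo + hi) // 2
          if arr[mid] == target:
              return mid              # found it
          elif arr[mid] < target:
              lo = mid + 1            # target can only be to the right
          else:
              hi = mid - 1            # target can only be to the left
      return -1                       # target is not present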
I give this section an 8.5.
====== Section 2.5 - A More Complex Data Structure: Priority Queues ======
**The Problem**
We will use a data structure called a heap to implement a pq. The heap data structure combines the benefits of a sorted array and a list for the purposes of this application. Conceptually, we think of a heap as a balanced binary tree: there is a root, and each node can have up to two children, a left child and a right child. The keys are arranged in heap order:
Heap order: For every element //v//, at a node //i//, the element //w// at //i//'s parent satisfies key(//w//) ≤ key(//v//).
Before we discuss how to work with a heap, we need to consider what data structure should be used to represent it. We can use pointers: each node of the heap could keep the element it stores, its key, and three pointers pointing to the two children and the parent of the heap node. We can avoid using pointers, however, if a bound //N// is known in advance on the total number of elements that will ever be in the heap at any one time. Such heaps can be maintained in an array //H// indexed by //i// = 1, ..., //N//: the root is in position 1, the children of the node in position //i// are in positions 2//i// and 2//i// + 1, and the parent of the node in position //i// is in position ⌊//i///2⌋.
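As a small concrete sketch of this array layout (the helper names here are my own, not from the text), the positions of a node's parent and children can be computed directly from its index:

  # Index arithmetic for a heap stored in an array H[1..N]
  # (position 0 is left unused so the formulas match the 1-indexed text).
  def parent(i):
      return i // 2        # parent of the node in position i

  def left_child(i):
      return 2 * i         # left child of the node in position i

  def right_child(i):
      return 2 * i + 1     # right child of the node in position i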
+ | |||
**Implementing the Heap Operations**
The heap element with the smallest key is at the root, so it takes //O(1)// time to identify the minimal element.

Consider adding a new heap element //v//, and assume that our heap //H// has //n// < //N// elements so far. We can add the new element //v// to the final position //i// = //n// + 1 by setting H[//i//] = //v//. Unfortunately, this does not in general maintain the heap property, because the key of //v// may be smaller than the key of its parent; we fix this with the procedure Heapify-up:
  Heapify-up(H, i):
    If i > 1 then
      let j = parent(i) = ⌊i/2⌋
      If key[H[i]] < key[H[j]] then
        swap the array entries H[i] and H[j]
        Heapify-up(H, j)
      Endif
    Endif
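A direct translation of this pseudocode into running code might look like the following sketch (my own illustration, assuming a 1-indexed array H with H[0] unused and elements compared by their keys directly):

  # Heapify-up: while H[i] is smaller than its parent, swap it upward.
  # H is 1-indexed (H[0] is unused) and stores the keys directly.
  def heapify_up(H, i):
      while i > 1:
          j = i // 2                    # parent(i)
          if H[i] < H[j]:
              H[i], H[j] = H[j], H[i]   # swap with the parent
              i = j                     # continue from the parent's position
          else:
              break                     # heap order is restored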
To see why Heapify-up works and eventually fixes the heap, let's look more closely at the not-yet-fixed heap. Assume that //H// is an array and //v// is the element in position //i//. We say that "H is almost a heap with the key of //H[i]// too small" if there is a value α ≥ key(//v//) such that raising the value of key(//v//) to α would make the resulting array satisfy the heap property.
- The procedure Heapify-up(//H, i//) fixes the heap property in //O//(log //i//) time, assuming that the array //H// is almost a heap with the key of //H[i]// too small. Using Heapify-up, we can insert a new element into a heap of //n// elements in //O//(log //n//) time.

Consider deleting an element. After deleting an element at position //i//, there will be a "hole" at position //i//, which we fill by moving the element //w// from position //n// into position //i//. At that point //w// may violate heap order by being either too small for its new position (fixed by Heapify-up) or too big (fixed by the procedure Heapify-down):
  Heapify-down(H, i):
    Let n = length(H)
    If 2i > n then
      Terminate with H unchanged
    Else if 2i < n then
      Let left = 2i and right = 2i + 1
      Let j be the index that minimizes key[H[left]] and key[H[right]]
    Else if 2i = n then
      Let j = 2i
    Endif
    If key[H[j]] < key[H[i]] then
      swap the array entries H[i] and H[j]
      Heapify-down(H, j)
    Endif
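Again as a sketch (same assumptions as the Heapify-up code above: 1-indexed array, keys compared directly; the extra parameter //n// for the current heap size is my own addition), this could be written as:

  # Heapify-down: while H[i] is larger than its smaller child, swap it downward.
  # H is 1-indexed (H[0] is unused); n is the number of elements currently in the heap.
  def heapify_down(H, i, n):
      while 2 * i <= n:                 # stop once position i has no children
          left, right = 2 * i, 2 * i + 1
          j = left
          if right <= n and H[right] < H[left]:
              j = right                 # j is the index of the smaller child
          if H[j] < H[i]:
              H[i], H[j] = H[j], H[i]   # swap with the smaller child
              i = j                     # continue from the child's position
          else:
              break                     # heap order is restored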
Assume that //H// is an array and //w// is the element in position //i//. We say that "H is almost a heap with the key of //H[i]// too big" if there is a value α ≤ key(//w//) such that lowering the value of key(//w//) to α would make the resulting array satisfy the heap property.
- The procedure Heapify-down(//H, i//) fixes the heap property in //O//(log //n//) time, assuming that //H// is almost a heap with the key of //H[i]// too big. Using Heapify-up or Heapify-down, we can delete an element from a heap of //n// elements in //O//(log //n//) time.

**Implementing PQs with Heaps**
The heap data structure with the Heapify-down and Heapify-up operations can efficiently implement a pq that is constrained to hold at most //N// elements at any point in time. Here is a summary of the operations we will use (a small usage sketch follows the list):
* StartHeap(//N//) returns an empty heap //H// that is set up to store at most //N// elements. This operation takes //O//(//N//) time.
* Insert(//H, v//) inserts the item //v// into heap //H//. If the heap currently has //n// elements, this takes //O//(log //n//) time.
* FindMin(//H//) identifies the minimum element in the heap //H// but does not remove it. This takes //O//(1) time.
* Delete(//H, i//) deletes the element in heap position //i//. This is implemented in //O//(log //n//) time for heaps that have //n// elements.
* ExtractMin(//H//) identifies and deletes the element with the minimum key value from the heap. This takes //O//(log //n//) time.
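To tie the operations together, here is a rough sketch (my own, not from the text) of a heap-backed pq built on the heapify_up and heapify_down routines sketched above; repeatedly extracting the minimum returns the items in sorted order:

  # A minimal heap-backed priority queue using the routines sketched above.
  class PriorityQueue:
      def __init__(self):
          self.H = [None]               # index 0 unused; heap lives in H[1..n]

      def insert(self, v):              # O(log n)
          self.H.append(v)
          heapify_up(self.H, len(self.H) - 1)

      def find_min(self):               # O(1): the minimum sits at the root
          return self.H[1]

      def extract_min(self):            # O(log n)
          v = self.H[1]
          last = self.H.pop()           # fill the hole at the root with the last element
          if len(self.H) > 1:
              self.H[1] = last
              heapify_down(self.H, 1, len(self.H) - 1)
          return v

  # Example: inserting keys and extracting them back out in sorted order.
  pq = PriorityQueue()
  for x in [5, 2, 8, 1]:
      pq.insert(x)
  print([pq.extract_min() for _ in range(4)])   # prints [1, 2, 5, 8]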
+ | |||
I thought this section was really clear and helped my understanding a lot. On a readability/