We seek algorithms that improve qualitatively on brute-force search, and in general we use polynomial-time solvability as the benchmark. Priority queues will be useful when we implement the algorithm for the Stable Matching problem.
The Problem
We discussed how we need to maintain a dynamically changing set S. In such cases, we want to be able to add, delete, and select elements from S when the algorithm calls for them. In a priority queue, each element has a priority value, or key, and each time we need to select an element from S, we want to take the one with highest priority.
A PQ is a data structure that maintains a set of elements S, in which every element has an associated key value. A motivating application of PQs is the problem of managing real-time events such as the scheduling of processes on a computer. Each process has a priority or urgency, but processes do not arrive in order of their priorities. We can maintain the set of processes in a PQ, with the key of a process representing its priority value. Scheduling the highest-priority process corresponds to selecting the element with minimum key from the priority queue; we will also be inserting new processes as they arrive, according to their priority values.
How efficiently do we hope to be able to execute the operations in a priority queue? –> We will show how to implement a PQ containing at most n elements at any time so that elements can be added and deleted, and the element with minimum key selected, in O(log n) time per operation.
Heaps
Heaps are data structures used for implementing PQs. A heap combines the benefits of a sorted array and a list for purposes of this application. Conceptually, we think of it as a balanced binary tree. Condition: it is a minimum heap, which means every node's key is less than or equal to the keys of its children. We can use either pointers or an array to represent a heap. For every node i, its left child is at position 2*i and its right child at (2*i) + 1. This means the parent of any node i is at position floor(i/2).
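To make the array representation concrete, here is a minimal sketch in Python (my own illustration, not the book's code), keeping the array 1-indexed so the formulas above apply directly:

```python
# Sketch of the 1-indexed array representation of a min-heap.
# Index 0 is left unused so the parent/child formulas match the text.

def left_child(i):
    return 2 * i

def right_child(i):
    return 2 * i + 1

def parent(i):
    return i // 2          # floor(i / 2)

# Example heap, stored level by level:
#            1
#          /   \
#         2     5
#        / \   /
#       4   3 9
H = [None, 1, 2, 5, 4, 3, 9]          # H[0] is just a placeholder
assert H[left_child(2)] == 4 and H[right_child(2)] == 3
assert H[parent(5)] == 2
```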
Implementing Heap Operations
The smallest element of the heap is at the root of the tree. What is the cost and process of adding or deleting elements from the binary tree?
* We add an element by appending it to the end of the tree, but this might not maintain the heap property. Thus we have to "heapify-up" the tree to restore this property: if the new element at position i has a parent with a larger key, swap it with its parent, and repeat until this condition no longer holds. Heapify-up can be found on page 61.
Heapify-up fixes the heap property in O(log i) time, assuming that the array H is almost a heap with the key of H[i] too small. Using Heapify-up we can insert a new element in a heap of n elements in O(log n) time.
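A rough Python sketch of Heapify-up and Insert along these lines (my paraphrase of the idea, assuming the 1-indexed array H from the sketch above, not the book's pseudocode verbatim):

```python
def heapify_up(H, i):
    """Restore the heap property when H[i] may be smaller than its parent.

    Assumes H is 1-indexed (H[0] unused) and is "almost a heap": only the
    key at position i can be too small. Does O(log i) swaps.
    """
    while i > 1 and H[i] < H[i // 2]:
        H[i], H[i // 2] = H[i // 2], H[i]     # swap with parent
        i = i // 2

def insert(H, v):
    """Insert key v by appending it at the end and heapifying up: O(log n)."""
    H.append(v)
    heapify_up(H, len(H) - 1)                 # index of the new last element
```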
ExtractMin(H) deletes the element with minimum key, but deletion in general is trickier: how do we remove an element from the middle of a tree? We delete the element at position i, and then fill that gap with the last element of the heap. If the new element at position i is too small, we Heapify-up. What if it is too big? We then Heapify-down. Heapify-down swaps the element at node i with its smallest child and proceeds down recursively. Heapify-down can be found on page 63. Heapify-down(H, i) fixes the heap property in O(log n) time, assuming that H is almost a heap with the key value of H[i] too big. Using Heapify-up or Heapify-down, we can delete an element from a heap of n elements in O(log n) time.
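A matching sketch of Heapify-down and ExtractMin (same 1-indexed convention, again my own paraphrase rather than the text's code):

```python
def heapify_down(H, i):
    """Restore the heap property when H[i] may be larger than its children.

    Assumes H is 1-indexed and "almost a heap": only the key at position i
    can be too big. Does O(log n) swaps.
    """
    n = len(H) - 1                            # number of real elements
    while True:
        left, right = 2 * i, 2 * i + 1
        smallest = i
        if left <= n and H[left] < H[smallest]:
            smallest = left
        if right <= n and H[right] < H[smallest]:
            smallest = right
        if smallest == i:                     # heap property holds here
            return
        H[i], H[smallest] = H[smallest], H[i]
        i = smallest

def extract_min(H):
    """Remove and return the minimum element (the root) in O(log n) time."""
    minimum = H[1]
    H[1] = H[-1]                              # fill the gap with the last element
    H.pop()
    if len(H) > 1:
        heapify_down(H, 1)
    return minimum
```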
Implementing PQs with Heaps
- StartHeap(N) –> O(N) Sets up an empty heap that can hold up to N elements.
- Insert(H, v) –> O(log n) Adds element v to the heap H.
- FindMin(H) –> O(1) Only finds the element but does not remove it.
- Delete(H, i) –> O(log n) Deletes the element in heap position i.
- ExtractMin(H) –> O(log n) Identifies and deletes an element with minimum key value from a heap.
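Putting the sketches above together, a quick usage example (with made-up keys) showing how these operations map onto the functions defined earlier; StartHeap(N) would instead just set up an empty heap of capacity N:

```python
# Build a heap by repeated Insert; FindMin is just a peek at the root.
H = [None]                      # 1-indexed heap: H[0] is a placeholder
for key in [7, 3, 9, 1, 5]:
    insert(H, key)              # O(log n) each

print(H[1])                     # FindMin(H): O(1), prints 1
print(extract_min(H))           # ExtractMin(H): O(log n), prints 1
print(extract_min(H))           # next minimum, prints 3
```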
A Position array stores the current position of each element in the heap. With it we can implement Delete for an arbitrary element, and maintaining the array does not change the overall running time: deletion can still be done in O(log n) time. We can also implement ChangeKey, which changes the key value of an element; this also takes O(log n) time.
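One possible way to realize this (a hypothetical sketch, not the book's implementation): store (key, name) pairs in the heap and keep a dictionary Position mapping each element's name to its current index, updating it on every swap.

```python
# Hypothetical sketch: heap H holds (key, name) pairs (1-indexed, H[0] unused),
# and Position[name] records where that element currently sits.

def swap(H, Position, i, j):
    H[i], H[j] = H[j], H[i]
    Position[H[i][1]], Position[H[j][1]] = i, j

def sift(H, Position, i):
    """Heapify-up or Heapify-down from position i, whichever applies."""
    while i > 1 and H[i][0] < H[i // 2][0]:            # Heapify-up
        swap(H, Position, i, i // 2)
        i //= 2
    n = len(H) - 1
    while True:                                        # Heapify-down
        smallest = i
        for c in (2 * i, 2 * i + 1):
            if c <= n and H[c][0] < H[smallest][0]:
                smallest = c
        if smallest == i:
            return
        swap(H, Position, i, smallest)
        i = smallest

def delete(H, Position, name):
    """Delete the element called name in O(log n) using the position map."""
    i = Position.pop(name)
    last = H.pop()
    if i <= len(H) - 1:          # the freed slot was not the last one
        H[i] = last
        Position[last[1]] = i
        sift(H, Position, i)

def change_key(H, Position, name, new_key):
    """Change the key of name and restore the heap property in O(log n)."""
    i = Position[name]
    H[i] = (new_key, name)
    sift(H, Position, i)
```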
Question: can we always use heaps as the preferred data structure to implement algorithms that run as fast as the algorithm for the Stable Matching problem?
Is it possible to only use half of the tree (use the last child on the right-hand side of the node to be deleted) when implementing Heapify-up and Heapify-down?
This section was pretty easy to understand, thus I give it a 9/10 in terms of readability.