courses:cs211:winter2018:journals:holmesr:section_2.1 [2018/01/29 19:21] (current) – holmesr

I found this section easy to read and I appreciated how it began with a review of the unique characteristics of each data structure and then discussed the steps that occurred inside the loop, including details that ushered the reader towards the implementation of one data structure or the other. The final section, with rigorous explanations of why one data structure was preferred over the other for each step, helped show what to look for in components of other algorithms when trying to decide which data structures to implement.

===== Section 2.4 A Survey of Common Running Times =====

The first common running time treated in this section is O(n) running time. This is sometimes described as a "linear" running time because the amount of work performed grows in direct proportion to the size of the input.

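As a concrete illustration (my own example, not one from the book), finding the maximum of a list is a classic linear-time computation: one pass over the input with a constant amount of work per element.

```python
def find_max(values):
    """Return the largest element, examining each item exactly once: O(n)."""
    best = values[0]
    for v in values[1:]:
        if v > best:
            best = v
    return best

print(find_max([3, 41, 7, 19]))  # 41
```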
This section was very straightforward and easy to read due to its discussion of running times, which I have studied before.

===== Section 2.5 Priority Queues =====

Statement 2.11 from the book says that "a sequence of O(n) priority queue operations can be used to sort a set of n numbers".

Regardless, we will use a heap to implement our priority queue. It helps to visualize a heap as a balanced binary tree with a root node and each node having two or fewer child nodes. The heap will be represented in an array with an object at position i, its left child at 2i and its right child at 2i+1. Importantly, heap order requires that the key of every node be at least as large as the key of its parent, so the smallest key always sits at the root.

Maintaining the heap order in an array during additions and deletions requires special algorithms that rearrange the nodes and allow for insertion and deletion in O(log n) time. First, to add an item to the heap, it is inserted into position n + 1 (for a heap of size n) and then Heapify-up is called. Heapify-up checks whether the recently inserted value is less than the value of its parent node and, if so, it swaps the two and recursively calls itself. This maintains heap order by pushing lower values towards the top of the array. To delete an item in a heap and maintain heap order, the item at position n is placed in the array where the item to be deleted had been, and then Heapify-down is called. As one would expect, Heapify-down works in the opposite direction of Heapify-up by comparing a parent to its children and swapping the parent with the lower child if possible. This algorithm also calls itself recursively until heap order has been restored.

Some important running times to note when implementing priority queues with heaps:

| + | * '' | ||
| + | |||
| + | * '' | ||
| + | |||
| + | * '' | ||
| + | |||
| + | * '' | ||
| + | |||
| + | * '' | ||