I did not find this section as easy to follow as our discussion in class about the same decisions. If we had not worked through these questions prior to this reading I would have had a very hard time understanding their logic, since it was weakly justified. I rate this section a 5 for readability. I would also have liked a more in-depth explanation of the method to ensure O(//n//<sup>2</sup>) running time as opposed to O(//n//<sup>3</sup>), which we discussed in class.

===== 2.4 A Survey of Common Running Times =====
As we learned earlier, a goal when designing algorithms is to achieve a more efficient running time than a brute-force approach would have. Several running times appear frequently in algorithms, including linear time, O(//n// log //n//) time, quadratic time, cubic time, and O(//n<sup>k</sup>//) time. Linear time means the running time is at most C • //n//, where C is a constant and //n// is the input size; examples of linear-time algorithms include computing the maximum of //n// numbers and merging two sorted lists. A running time of O(//n// log //n//) typically describes algorithms that split their input into two equal pieces, solve each piece recursively, and combine the two solutions in linear time; merge sort is the most common example. One example of an algorithm with quadratic time is one with nested loops, such as an algorithm that iterates over all pairs of items from two arrays of size //n//, performing a constant-time operation on each pair. Cubic time similarly arises from algorithms that perform more complex operations inside nested loops. An example of O(//n<sup>k</sup>//) running time is the brute-force search for an independent set of size //k// in a graph, which must examine all O(//n<sup>k</sup>//) subsets of //k// nodes.
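
To make the linear-time bound for merging concrete, here is a minimal sketch in Python (my own illustration; the names and details are not from the text). Each comparison permanently consumes one element from one of the two lists, so the total work is proportional to the combined length:
    def merge(a, b):
        """Merge two sorted lists in O(n) time, where n = len(a) + len(b)."""
        result, i, j = [], 0, 0
        while i < len(a) and j < len(b):
            # Append the smaller front element; each step consumes one item.
            if a[i] <= b[j]:
                result.append(a[i])
                i += 1
            else:
                result.append(b[j])
                j += 1
        # One list is exhausted; the remainder of the other is already sorted.
        return result + a[i:] + b[j:]
    merge([2, 5, 9], [1, 3, 8])   # returns [1, 2, 3, 5, 8, 9]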

There are some running times that are worse than any polynomial running time, including 2<sup>//n//</sup> and //n//!. These functions grow faster than every polynomial and are therefore not ideal running times for algorithms. Algorithms that search through all subsets of an //n//-element set run in 2<sup>//n//</sup> time. Algorithms that consider every way to pair //n// items with //n// other items, or every way to arrange //n// items in order, run in //n//! time, which grows even faster than 2<sup>//n//</sup>. On the other end of the spectrum, some algorithms run in O(log //n//) time, which is substantially better than linear time.
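
Binary search is the standard example of O(log //n//) time: each comparison rules out half of the remaining portion of a sorted array. A minimal sketch in Python (my own, assuming the input list is already sorted):
    def binary_search(a, target):
        """Return an index of target in sorted list a, or -1 if absent."""
        lo, hi = 0, len(a) - 1
        while lo <= hi:
            mid = (lo + hi) // 2
            if a[mid] == target:
                return mid
            elif a[mid] < target:
                lo = mid + 1   # target, if present, is in the right half
            else:
                hi = mid - 1   # target, if present, is in the left half
        return -1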

===== 2.5 A More Complex Data Structure: Priority Queues =====
A priority queue is considered one of the most broadly applicable data structures for use in algorithms. It is useful when each element in a group has a priority, which we call a key, and we repeatedly seek the element with the highest priority (here, the smallest key). Ideally, we would like a priority queue that performs insertion and deletion operations in O(log //n//) time, which would let it sort //n// numbers in O(//n// log //n//) time. A heap can be used to implement a priority queue: for each element at node //i//, the element at //i//'s parent has a key less than or equal to the key of the element at //i//. It is important to note that for a heap //H//, //H//[1] is the root. In addition, for a given position //i//, the left child of //i// is at position 2//i// and the right child is at position 2//i// + 1.
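
To make the indexing concrete, here is a small worked example in Python (my own illustration; it keeps the book's 1-indexed convention by leaving slot 0 unused, and the array entries serve as their own keys):
    # A valid heap stored 1-indexed: slot 0 is an unused placeholder.
    H = [None, 2, 5, 3, 9, 6, 4]
    i = 2                        # the node holding key 5
    parent = i // 2              # position 1, key 2
    left, right = 2*i, 2*i + 1   # positions 4 and 5, keys 9 and 6
    # Heap order: every node's key is at least its parent's key.
    assert H[parent] <= H[i] <= min(H[left], H[right])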

In order to maintain a heap's structure after the insertion of an element, the algorithm Heapify-up will be useful. This algorithm can be roughly sketched as follows:
    Heapify-up(H, i):
        if i > 1 then
            let j = parent(i) = ⌊i/2⌋
            if key[H[i]] < key[H[j]] then
                swap array entries H[i] and H[j]
                Heapify-up(H, j)
            endif
        endif
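
The same procedure as runnable Python, sketched under the conventions assumed above (1-indexed array with slot 0 unused, entries serving as their own keys; the function name is mine):
    def heapify_up(H, i):
        """Restore heap order upward after H[i] may have become too small."""
        if i > 1:
            j = i // 2                   # parent(i) = floor(i/2)
            if H[i] < H[j]:
                H[i], H[j] = H[j], H[i]  # swap child with its parent
                heapify_up(H, j)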

Deleting an element must also maintain the order of the heap. The algorithm Heapify-down is useful for this, and is sketched as follows:
    Heapify-down(H, i):
        let n = length(H)
        if 2i > n then
            terminate with H unchanged
        else if 2i < n then
            let left = 2i and right = 2i + 1
            let j be the index among left and right that minimizes key[H[j]]
        else if 2i = n then
            let j = 2i
        endif
        if key[H[j]] < key[H[i]] then
            swap array entries H[i] and H[j]
            Heapify-down(H, j)
        endif
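
A matching Python sketch of Heapify-down under the same assumed conventions, followed by a small insertion example that reuses the heapify_up sketch from above (the example values are mine):
    def heapify_down(H, i):
        """Restore heap order downward after H[i] may have become too large."""
        n = len(H) - 1                   # slot 0 is unused
        if 2 * i > n:
            return                       # i is a leaf; nothing to do
        elif 2 * i < n:                  # both children exist
            left, right = 2 * i, 2 * i + 1
            j = left if H[left] <= H[right] else right   # smaller-key child
        else:                            # 2i == n: only a left child exists
            j = 2 * i
        if H[j] < H[i]:
            H[i], H[j] = H[j], H[i]
            heapify_down(H, j)
    # Example: insert key 1 into the heap from above and restore order.
    H = [None, 2, 5, 3, 9, 6, 4]
    H.append(1)                  # new element sits at position 7
    heapify_up(H, len(H) - 1)    # H is now [None, 1, 5, 2, 9, 6, 4, 3]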
  