====== Chapter 4.8 ======
  
Lock the two lowest-frequency letters together into a single metaletter, which suggests a recursive strategy:

**Huffman's Algorithm:**
  
    To construct a prefix code for an alphabet S, with given frequencies:
      If S has two letters then
        Encode one letter using 0 and the other letter using 1
      Else
        Let y* and z* be the two lowest-frequency letters
        Form a new alphabet S' by deleting y* and z* and replacing them
          with a new letter w of frequency f(y*) + f(z*)
        Recursively construct a prefix code for S', with tree T'
        Define a prefix code for S as follows:
          Start with T'
          Take the leaf labeled w and add two children below it labeled y* and z*
      Endif
              
The prefix code this algorithm produces is called a Huffman code. This is a bottom-up approach: it starts with the leaves for the two lowest-frequency letters and works its way up toward the leaves with the highest frequencies.

**Analysis:**

We prove the optimality of Huffman's algorithm by induction. The proof is really long and complicated, but it boils down to showing by contradiction that the tree Huffman's produces is optimal. The total running time of Huffman's algorithm is O(k log k): if we implement the priority queue with a heap, we can insert and extract the minimum in O(log k) time, and we do this a total of k times.
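To make the bottom-up merging and the heap-based bound concrete, here is a minimal Python sketch (my own illustration, not code from the chapter); the function name and the dict-of-frequencies input format are my own choices, and Python's heapq plays the role of the priority queue:

  import heapq
  import itertools

  def huffman_code(freq):
      # Build a prefix code from {letter: frequency} by repeatedly merging
      # the two lowest-frequency entries into a metaletter.
      counter = itertools.count()   # tie-breaker so heap entries always compare
      # Each heap entry: (frequency, tie-breaker, {letter: codeword so far})
      heap = [(f, next(counter), {letter: ""}) for letter, f in freq.items()]
      heapq.heapify(heap)
      while len(heap) > 1:
          f1, _, codes1 = heapq.heappop(heap)   # lowest frequency
          f2, _, codes2 = heapq.heappop(heap)   # second-lowest frequency
          # The metaletter's subtree: letters from the first group get a leading 0,
          # letters from the second group get a leading 1.
          merged = {ltr: "0" + cw for ltr, cw in codes1.items()}
          merged.update({ltr: "1" + cw for ltr, cw in codes2.items()})
          heapq.heappush(heap, (f1 + f2, next(counter), merged))
      return heap[0][2]

  # Higher-frequency letters end up with shorter codewords:
  print(huffman_code({"a": 0.32, "b": 0.25, "c": 0.20, "d": 0.18, "e": 0.05}))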
This section was realllllly long to read, which made it more boring, but the presentation of it in class was much more interesting. The proof is definitely a part of the chapter that I will have to come back to in order to understand it; four pages with that many variables was intimidating. I would rate this readability: 7/10.
====== Chapter 5.1 ======

**A First Recurrence: The Mergesort Algorithm**

In this section we are beginning divide-and-conquer algorithms!!

Mergesort
  * sorts a given list of numbers
  * divides the list into two equal parts
  * sorts each half recursively
  * combines the results, spending linear time on the initial division and the final recombining

Let T(n) be the worst-case running time of the algorithm on a list of size n. Dividing the list into two halves of size n/2 takes O(n) time, recursively sorting each half takes T(n/2) time, and combining the sorted halves back together takes another O(n). SO....

**(5.1)** For some constant c, T(n) ≤ 2T(n/2) + cn when n > 2, and T(2) ≤ c.
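As a concrete illustration (a sketch of my own, not code from the book, with function names of my choosing), here is mergesort in Python with comments marking where the 2T(n/2) and cn terms in (5.1) come from:

  def merge_sort(a):
      # Base case: a list of one element is already sorted (this plays the role of T(2) <= c).
      if len(a) <= 1:
          return a
      mid = len(a) // 2
      left = merge_sort(a[:mid])        # T(n/2)
      right = merge_sort(a[mid:])       # T(n/2)
      return merge(left, right)         # dividing + recombining: the cn term

  def merge(left, right):
      # Standard linear-time merge of two sorted lists.
      out, i, j = [], 0, 0
      while i < len(left) and j < len(right):
          if left[i] <= right[j]:
              out.append(left[i]); i += 1
          else:
              out.append(right[j]); j += 1
      out.extend(left[i:])
      out.extend(right[j:])
      return out

  print(merge_sort([5, 2, 9, 1, 7, 3]))   # [1, 2, 3, 5, 7, 9]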
T(n) is bounded by an inequality, and there is a base case that says T(n) is a constant when n is a small constant. To solve the recurrence, we need to get T by itself on the left-hand side of the inequality.

How to solve recurrences:
  * "unroll" the recursion: account for the running time of the first few levels, identify the pattern as the recursion expands, then sum the running times over all levels
  * start with a guess and check that it works by substituting it into the recurrence; justify the guess by induction!
**Unrolling the Mergesort Recurrence**
  * level 0 (the top level): one problem of size n, which takes at most cn time plus the time spent in the recursive calls
  * level 1: two problems of size n/2, each taking at most cn/2 time, for a total of at most cn
  * level 2: four problems of size n/4, each taking at most cn/4 time, for a total of at most cn
  * pattern: at level j of the recursion there are 2<sup>j</sup> subproblems, each of size n/2<sup>j</sup> and each taking at most cn/2<sup>j</sup> time, so each level does at most cn total work
  * sum over all levels: the input can only be halved log<sub>2</sub> n times, so summing the cn of work per level over the log n levels gives O(n log n) (a quick numeric check follows this list)
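As that quick check (my own addition, not from the book, taking c = 1), the snippet below computes T(n) directly from the recurrence and compares it to the level-by-level bound cn log2 n:

  import math

  def T(n, c=1):
      # The recurrence from (5.1), taken with equality.
      return c if n <= 2 else 2 * T(n // 2, c) + c * n

  def level_bound(n, c=1):
      # Unrolling: roughly log2(n) levels, each doing at most c*n total work.
      return c * n * int(math.log2(n))

  for n in [16, 256, 4096, 65536]:
      print(n, T(n), level_bound(n))   # the recursive total stays below the level-by-level bound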
**Substituting a Solution into the Mergesort Recurrence**
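For the substitution method (the sketch below is my own, not the author's notes): guess that T(n) ≤ cn log2 n, plug the guess into (5.1), and confirm that the inequality survives; the real argument makes this an induction on n.

  import math

  # Substitution: assume T(n/2) <= c*(n/2)*log2(n/2) and plug it into (5.1):
  #   T(n) <= 2*T(n/2) + c*n
  #        <= 2*c*(n/2)*log2(n/2) + c*n     (apply the guess to T(n/2))
  #         = c*n*(log2(n) - 1) + c*n       (since log2(n/2) = log2(n) - 1)
  #         = c*n*log2(n)
  # so the guess passes through the recurrence. A small numeric spot check with c = 1:
  def T(n, c=1):
      return c if n <= 2 else 2 * T(n // 2, c) + c * n

  for n in [4, 64, 1024, 2 ** 20]:
      assert T(n) <= n * math.log2(n)
  print("T(n) <= n log2 n at all sampled sizes")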
  
I would rate this chapter's readability an 8/10. It was fairly short and easy to follow, but if I hadn't read it after it was presented in class I would have been a little lost.