Lock the two lowest frequencies together into a metaletter...recursive strategy:

**Huffman's Algorithm**

To construct a prefix code for an alphabet S, with given frequencies:
Endif

The prefix code this algorithm produces is called the Huffman code. This is a bottom-up approach: it focuses on the leaves with the two lowest frequencies and works its way up to the leaves with the highest frequencies.
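Here is a minimal Python sketch of that bottom-up merging, assuming the input is a dict of letter frequencies; the function names (huffman_code, walk) and the heap bookkeeping are my own illustration choices, not the book's pseudocode.

<code python>
import heapq

def huffman_code(freqs):
    """Build a prefix code from {letter: frequency} by repeatedly merging
    the two lowest-frequency entries into a single 'metaletter'."""
    # Heap entries are (frequency, tiebreaker, tree); a tree is either a
    # letter or a pair (left_subtree, right_subtree).
    heap = [(f, i, letter) for i, (letter, f) in enumerate(sorted(freqs.items()))]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        f1, _, t1 = heapq.heappop(heap)                       # lowest frequency
        f2, _, t2 = heapq.heappop(heap)                       # second lowest
        heapq.heappush(heap, (f1 + f2, counter, (t1, t2)))    # merged metaletter
        counter += 1
    codes = {}
    def walk(tree, prefix):
        # Left edges get '0', right edges get '1'; leaves are the real letters.
        if isinstance(tree, tuple):
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")
        else:
            codes[tree] = prefix or "0"   # single-letter alphabet edge case
    walk(heap[0][2], "")
    return codes

# More frequent letters end up with shorter codewords.
print(huffman_code({"a": 0.45, "b": 0.20, "c": 0.15, "d": 0.12, "e": 0.08}))
</code>

Using a heap keeps each extraction of the two lowest-frequency letters cheap, so with k letters the whole construction runs in O(k log k).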
+ | |||
+ | **Analysis: | ||
+ | |||
+ | We prove the optimality of Huffman' | ||
+ | |||
+ | This section was realllllly long to read which made it more boring, but the presentation of it in class was much more interesting. The proof was definitely a part of the chapter that I will have to come back to to understand. Four pages of a lot of variables was intimidating. I would rate this readability: | ||
+ | |||
+ | ====== Chapter 5.1 ====== | ||
+ | |||
+ | **A First Recurrence: The Mergesort Algorithm** | ||
+ | |||
+ | In this section we are beginning divide-and-conquer algorithms!! | ||
+ | |||
+ | Mergesort | ||
  * sorts a given list of numbers
  * divides the list into two equal parts
  * sorts each half recursively
  * combines the results, spending linear time on the initial division and the final recombining
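A minimal Python sketch of those four steps (the names merge_sort and merge are mine, just for illustration):

<code python>
def merge_sort(nums):
    """Sort a list of numbers by splitting it in half, sorting each half
    recursively, and merging the two sorted halves in linear time."""
    if len(nums) <= 1:            # base case: nothing to split
        return nums
    mid = len(nums) // 2          # divide into two (roughly) equal parts
    left = merge_sort(nums[:mid])
    right = merge_sort(nums[mid:])
    return merge(left, right)     # combine in O(n) time

def merge(left, right):
    """Merge two already-sorted lists into one sorted list."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])          # one of these two tails is empty
    out.extend(right[j:])
    return out

print(merge_sort([5, 2, 9, 1, 7, 3]))   # [1, 2, 3, 5, 7, 9]
</code>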
+ | |||
+ | Let T(n) be the worst-case running time of the algorithm on a list of size n. To divide the list into two even parts (n/2) takes O(n) time. For each division it takes T(n/2) to solve the sorting. It then spends O(n) to combine the lists back together. SO.... | ||
+ | |||
+ | **(5.1)** For some constant c, T(n) =< 2T(n/2) + cn when n > 2 and T(2) =< c. | ||
+ | |||
+ | T(n) is bounded by an inequality. There is a base case that says T(n) is equal to a constant when n is a constant. We have to solve the recurrence so that T is only on the LHS of the inequality. | ||
+ | |||
+ | How to solve recurrences: | ||
+ | * " | ||
+ | * start with a guess and check if it works by substituting it into the recursion. Justify by induction! | ||
+ | |||
+ | **Unrolling the Mergesort Recurrence** | ||
  * level 1: one problem of size n, which takes cn time
  * level 2: two problems of size n/2, each taking cn/2 time, for a total of at most cn
  * level 3: four problems of size n/4, each taking cn/4 time, for a total of at most cn
  * pattern: at level j of the recursion the number of subproblems has doubled j-1 times, so there are 2<sup>j-1</sup> subproblems, each of size n/2<sup>j-1</sup> and each taking at most cn/2<sup>j-1</sup> time, for a total of at most cn
  * sum over all levels: the input can only be halved log n times, so summing the cn work over the log n levels gives O(n log n) (a small numerical check follows below)
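As an informal sanity check on that sum (not from the book; c = 1 and the power-of-two sizes are just convenient assumptions), the recurrence can be evaluated directly and compared against n log n:

<code python>
import math

def T(n, c=1):
    """Evaluate the recurrence T(n) = 2*T(n/2) + c*n with T(2) = c,
    for n a power of two."""
    if n <= 2:
        return c
    return 2 * T(n // 2, c) + c * n

# The ratio T(n) / (c * n * log2 n) stays below 1, matching the
# "at most cn work per level, log n levels" argument above.
for n in [4, 16, 64, 256, 1024]:
    bound = n * math.log2(n)
    print(n, T(n), round(bound), round(T(n) / bound, 3))
</code>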
+ | |||
+ | **Substituting a Solution into the Mergesort Recurrence** | ||
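A worked version of the guess-and-check step sketched above (standard argument, wording mine): guess that T(m) <= cm log m for every m < n, with logarithms base 2. Substituting the guess into (5.1) gives T(n) <= 2T(n/2) + cn <= 2c(n/2)log(n/2) + cn = cn(log n - 1) + cn = cn log n, so the guess holds for n as well and, by induction, T(n) = O(n log n).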
I would rate this chapter a readability of 8/10. It was fairly short and easy to follow, but if I hadn't read this chapter after it was presented in class I would have been a little lost.