Then they write the recursive algorithm like that, but uh oh...it turns out to be exponential, because you have to recalculate the same values over and over. But then memoization comes along to save the day: it just stores the value of every call to OPT(j) so it never has to be recomputed.
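Here's a rough sketch of what that memoized recursion might look like in Python (my own code, not the book's pseudocode; I'm assuming the jobs are already sorted by finish time, with values v[j] and a precomputed p[j], the last job that ends before job j starts):

<code python>
# Rough sketch of the memoized recursion (my version, not the book's).
# Assumes jobs 1..n are sorted by finish time, v[j] is the value of job j,
# and p[j] is the last job compatible with job j (0 if there is none).
def weighted_interval(v, p):
    n = len(v) - 1          # v[0] and p[0] are just placeholders for "no job"
    memo = {0: 0}           # OPT(0) = 0

    def opt(j):
        if j not in memo:
            # either skip job j, or take it plus the best compatible prefix
            memo[j] = max(opt(j - 1), v[j] + opt(p[j]))
        return memo[j]

    return opt(n)

# tiny example: job 3 overlaps job 2 but not job 1
print(weighted_interval(v=[0, 2, 4, 4], p=[0, 0, 0, 1]))   # -> 6
</code>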
Readability:
==6.2==
This section just points out that the whole recursion thing is unnecessary. You can't compute the final values of the solution (the end of the memoization array) unless you already have the earlier values--so you might as well quit trying to start from the end with all those backwards calls and just build the array iteratively from the beginning.
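A quick sketch of that iterative version, using the same made-up v and p as the memoized sketch above:

<code python>
# Bottom-up version: fill the array M from the start, so M[j-1] and M[p[j]]
# already exist by the time M[j] gets computed.  Same assumptions as before.
def weighted_interval_iterative(v, p):
    n = len(v) - 1
    M = [0] * (n + 1)                        # M[0] = OPT(0) = 0
    for j in range(1, n + 1):
        M[j] = max(M[j - 1], v[j] + M[p[j]])
    return M[n]

print(weighted_interval_iterative(v=[0, 2, 4, 4], p=[0, 0, 0, 1]))   # -> 6
</code>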
"(iii) There is a natural ordering on subproblems from " | "(iii) There is a natural ordering on subproblems from " | ||
Readability:
==6.3==
OPT(j) = min {1<=i<=j} ( e(i, j) + C + OPT(i-1) )
The algorithm can be made to run in O(n^2) time.
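Here's my own sketch of that recurrence in Python (definitely not the book's code -- to keep it short I compute each error e(i,j) from scratch, which is slower than the O(n^2) version that reuses running sums):

<code python>
# Sketch of segmented least squares.  e(i, j) is the squared error of the
# best-fit line through points i..j, and C is the penalty per segment.
# Computing every e(i, j) from scratch like this is O(n^3) overall; the
# O(n^2) bound needs the errors precomputed with running sums.
def lsq_error(pts):
    n = len(pts)
    sx = sum(x for x, _ in pts)
    sy = sum(y for _, y in pts)
    sxx = sum(x * x for x, _ in pts)
    sxy = sum(x * y for x, y in pts)
    den = n * sxx - sx * sx
    a = (n * sxy - sx * sy) / den if den != 0 else 0.0
    b = (sy - a * sx) / n
    return sum((y - a * x - b) ** 2 for x, y in pts)

def segmented_least_squares(points, C):
    n = len(points)
    e = {(i, j): lsq_error(points[i - 1:j])
         for i in range(1, n + 1) for j in range(i, n + 1)}
    M = [0.0] * (n + 1)                      # M[0] = OPT(0) = 0
    for j in range(1, n + 1):
        M[j] = min(e[(i, j)] + C + M[i - 1] for i in range(1, j + 1))
    return M[n]

# five points that fall on two different lines; with penalty C = 1 the best
# answer is two perfect segments, so the total cost is 2.0
pts = [(0, 0), (1, 1), (2, 2), (3, 0), (4, -2)]
print(segmented_least_squares(pts, C=1.0))   # -> 2.0
</code>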
+ | |||
Readability: 10/10 (I was surprised it was so easy to follow even though I skipped lecture)
==6.4==
This section introduces the knapsack problem, but first a scheduling problem where you want to pack as many jobs with given weights onto a machine so it's as busy as possible. They demonstrate that some intuitive greedy solutions don't work, and then they try doing it the same way as they solved Weighted Interval Scheduling--but that doesn't work either, because knowing which jobs are left isn't enough; you also need to know how much capacity remains, so the subproblems get a second variable for the remaining weight.
The knapsack problem is a little different, because instead of just trying to fill something up to the max, each item you have also has a value, and you're trying to maximize value. Subset Sum is pretty much a special case of the knapsack problem where the value of every item is equal to its weight.
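A quick sketch of the Subset Sum table (again my code, not the book's pseudocode); the knapsack version would just add each item's value instead of its weight when it takes the item:

<code python>
# Subset Sum by dynamic programming: M[i][c] is the biggest total weight you
# can pack using only items 1..i with capacity c.  Runs in O(nW) time.
def subset_sum(w, W):
    n = len(w)
    M = [[0] * (W + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for c in range(W + 1):
            M[i][c] = M[i - 1][c]                    # skip item i
            if w[i - 1] <= c:                        # or take it, if it fits
                M[i][c] = max(M[i][c], w[i - 1] + M[i - 1][c - w[i - 1]])
    return M[n][W]

print(subset_sum([2, 3, 4], W=6))   # -> 6 (take the items of weight 2 and 4)
</code>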
Readability: