===== 6.1: Weighted Interval Scheduling: A Recursive Procedure =====
  * Dynamic programming solution: a recurrence equation that expresses the optimal solution (or its value) in terms of the optimal solutions to smaller sub-problems
  * Memoization:
    * Saving values that have already been computed to reduce run time (sketched below).
    * Analysis of the memoized algorithm is on page 257.
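A rough sketch of the memoized recursion in Python. The ''(start, finish, value)'' job format, the function names, and the use of ''bisect'' to compute p(j) are my own illustration choices, not the book's code.
<code python>
from bisect import bisect_right

def weighted_interval_scheduling(jobs):
    """jobs: list of (start, finish, value); returns the optimal total value."""
    jobs = sorted(jobs, key=lambda j: j[1])        # sort by finish time
    finishes = [f for _, f, _ in jobs]
    # p[j] = 1-based index of the last job compatible with job j+1,
    # i.e. the number of jobs that finish by that job's start time
    p = [bisect_right(finishes, s) for s, _, _ in jobs]

    memo = {0: 0}                                  # OPT(0) = 0

    def opt(j):                                    # OPT(j): best value using the first j jobs
        if j not in memo:
            value = jobs[j - 1][2]
            memo[j] = max(value + opt(p[j - 1]),   # take job j ...
                          opt(j - 1))              # ... or skip it
        return memo[j]

    return opt(len(jobs))
</code>
For example, ''weighted_interval_scheduling([(0, 3, 2), (2, 5, 4), (4, 7, 4)])'' returns 6 (the first and third jobs).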
+ | |||
+ | ===== 6.2: Principles of Dynamic Programming: | ||
  * Deals with using the array M built up by the memoized recursion from 6.1.
  * We can directly compute the entries in M by an iterative algorithm, rather than using memoized recursion (sketched below).
  * Analysis on page 259.
  * Second approach to dynamic programming: iteratively build up solutions to subproblems.
    * Subproblems for this approach must satisfy the following properties:
      * There are only a polynomial number of subproblems.
      * The solution to the original problem can be easily computed from the solutions to the subproblems.
      * There is a natural ordering on the subproblems from "smallest" to "largest," with an easy-to-compute recurrence that determines a subproblem's solution from the solutions to smaller subproblems.
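The same problem done iteratively, which is how I'd remember the 6.2 idea: fill the table M in order of increasing j instead of recursing. Same assumed job format as the sketch above.
<code python>
from bisect import bisect_right

def iterative_opt(jobs):
    """Iteratively fill M[0..n] for weighted interval scheduling."""
    jobs = sorted(jobs, key=lambda j: j[1])        # sort by finish time
    finishes = [f for _, f, _ in jobs]
    p = [bisect_right(finishes, s) for s, _, _ in jobs]

    n = len(jobs)
    M = [0] * (n + 1)                              # M[j] = optimal value on the first j jobs
    for j in range(1, n + 1):                      # natural "smallest to largest" ordering
        value = jobs[j - 1][2]
        M[j] = max(value + M[p[j - 1]], M[j - 1])
    return M[n]
</code>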
+ | |||
+ | ===== 6.3: Segmented Least Squares: Multi-way Choices ===== | ||
  * Given a set P of n points in the plane, we partition the points into contiguous segments and fit a line to each segment.
  * Rather than the binary choice in 6.1, each subproblem involves a multi-way choice: where does the last segment begin?
  * The penalty of a partition is the number of segments into which we partition P, times a fixed, given multiplier C > 0, plus the error value of the optimal line through each segment (sketched below).
  * Design and analysis of this segmented least squares problem can be found on pages 264-266.
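A rough sketch of the multi-way recurrence OPT(j) = min over i of e(i, j) + C + OPT(i - 1), where e(i, j) is the squared error of the best-fit line through points i..j. The point format and helper names are my own; recomputing e(i, j) on the fly keeps the sketch short, whereas the book precomputes these values first.
<code python>
def segmented_least_squares(points, C):
    """points: list of (x, y); C: per-segment penalty. Returns the minimum total penalty."""
    points = sorted(points)                        # order points by x-coordinate
    n = len(points)

    def error(i, j):
        """Squared error of the least-squares line through points[i..j] (inclusive, 0-indexed)."""
        pts = points[i:j + 1]
        m = len(pts)
        sx = sum(x for x, _ in pts)
        sy = sum(y for _, y in pts)
        sxx = sum(x * x for x, _ in pts)
        sxy = sum(x * y for x, y in pts)
        denom = m * sxx - sx * sx
        a = (m * sxy - sx * sy) / denom if denom else 0.0   # slope
        b = (sy - a * sx) / m                               # intercept
        return sum((y - a * x - b) ** 2 for x, y in pts)

    OPT = [0.0] * (n + 1)                          # OPT[j] = min penalty for the first j points
    for j in range(1, n + 1):
        OPT[j] = min(error(i - 1, j - 1) + C + OPT[i - 1]   # last segment is points i..j
                     for i in range(1, j + 1))
    return OPT[n]
</code>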
+ | |||
+ | ===== 6.4: Subset Sums and Knapsacks: Adding a Variable ===== | ||
  * Given a set of items, each with a given weight w_i, and a bound W on how much total weight we can carry.
  * Knapsack problem: each item also has a value v_i; find a subset of items of maximum total value whose total weight does not exceed W.
  * Creation and analysis of the optimal algorithm for the knapsack problem runs from page 269 through page 271.
  * The knapsack problem can be solved in O(nW) time, where n is the number of items that can be put in the sack and W is the weight bound (sketched below).
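A minimal sketch of the O(nW) table, assuming integer weights and a ''(weight, value)'' item format (my choice, not the book's notation).
<code python>
def knapsack(items, W):
    """items: list of (weight, value) with integer weights; W: capacity. Returns max total value."""
    n = len(items)
    # M[i][w] = best value achievable using the first i items with capacity w
    M = [[0] * (W + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        wi, vi = items[i - 1]
        for w in range(W + 1):
            M[i][w] = M[i - 1][w]                              # option 1: skip item i
            if wi <= w:
                M[i][w] = max(M[i][w], vi + M[i - 1][w - wi])  # option 2: take item i
    return M[n][W]
</code>
The two nested loops are where the O(nW) running time comes from, and why it is only //pseudo-polynomial// in the input size.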
+ | |||
+ | ===== Final Thoughts (End of Chap 5, Beg of 6) ===== | ||
+ | This chapter is a little bit more easily understood than last weeks chapter. All in all, the knapsack problem is very intuitive and so is the idea of dynamic programming. Readability: | ||
+ | |||
+ | ===== 6.5: RNA Secondary Structure: Dynamic Programming over Intervals ===== | ||
  * Adding a second variable lets us consider a subproblem for every contiguous interval in {1, 2, ..., n}.
  * RNA secondary structure prediction is a great example of this kind of problem.
  * Secondary structure occurs when RNA loops back and forms pairs with itself.
  * Design of an algorithm to predict the secondary structure of RNA can be found on pages 275-278; a small sketch follows.
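A rough sketch of the interval recurrence: OPT(i, j) is the best of leaving base j unpaired, or pairing it with some valid t, giving 1 + OPT(i, t-1) + OPT(t+1, j-1). The 0-indexed positions (so the no-sharp-turn rule reads t < j - 4) and the A-U / C-G pairing set are my illustration choices.
<code python>
from functools import lru_cache

def rna_secondary_structure(s):
    """s: RNA string over A, C, G, U. Returns the max number of base pairs."""
    pairs = {("A", "U"), ("U", "A"), ("C", "G"), ("G", "C")}

    @lru_cache(maxsize=None)
    def opt(i, j):
        if i >= j - 4:                    # no sharp turns: a pair (t, j) needs t < j - 4
            return 0
        best = opt(i, j - 1)              # case 1: base j is not paired
        for t in range(i, j - 4):         # case 2: base j pairs with some t
            if (s[t], s[j]) in pairs:
                best = max(best, 1 + opt(i, t - 1) + opt(t + 1, j - 1))
        return best

    return opt(0, len(s) - 1) if s else 0
</code>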
+ | |||
+ | ===== 6.6: Sequence Alignment ===== | ||
  * How do we define similarity between two words or strings?
  * An alignment M matches up positions of the two strings, without any matched pairs crossing.
  * Positions left unmatched are gaps, each charged a gap penalty.
  * Matching two different letters is a mismatch, charged a mismatch cost that depends on the two letters.
  * The cost of alignment M is the sum of its gap and mismatch costs.
  * Similarity is then measured by the alignment of minimum cost (sketched below).
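A sketch of the alignment recurrence with a flat gap penalty delta and a 0/1 mismatch cost; those particular cost choices and the function name are illustrative, not the book's.
<code python>
def alignment_cost(x, y, delta=1, alpha=lambda p, q: 0 if p == q else 1):
    """Minimum cost of aligning strings x and y (gap penalty delta, mismatch cost alpha)."""
    m, n = len(x), len(y)
    A = [[0] * (n + 1) for _ in range(m + 1)]      # A[i][j] = min cost of aligning x[:i], y[:j]
    for i in range(1, m + 1):
        A[i][0] = i * delta                        # x[:i] against nothing: all gaps
    for j in range(1, n + 1):
        A[0][j] = j * delta
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            A[i][j] = min(alpha(x[i - 1], y[j - 1]) + A[i - 1][j - 1],  # match x_i with y_j
                          delta + A[i - 1][j],                          # x_i gets a gap
                          delta + A[i][j - 1])                          # y_j gets a gap
    return A[m][n]
</code>
For example, ''alignment_cost("ocurrance", "occurrence")'' gives 2 with these costs (one gap, one mismatch).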
+ | |||
+ | ===== 6.7: Sequence Alignment in Linear Space via Divide and Conquer ===== | ||
  * Must get around the O(mn) space requirement of the full alignment table.
  * This section covers making sequence alignment work in O(mn) time using only O(m + n) space (sketched below).
  * Pages 285-290 cover the design and analysis of this algorithm.
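The core space trick, sketched: the alignment cost only needs the current and previous columns, so O(m + n) space suffices for the value. Recovering the alignment itself needs the divide-and-conquer step from pages 285-290, which this sketch does not show. Costs mirror the block above.
<code python>
def alignment_cost_linear_space(x, y, delta=1, alpha=lambda p, q: 0 if p == q else 1):
    """Same value as the full-table version above, but keeping only two columns."""
    m, n = len(x), len(y)
    prev = [i * delta for i in range(m + 1)]       # column j-1 of the DP table
    for j in range(1, n + 1):
        cur = [j * delta] + [0] * m                # column j; entry 0 is the all-gaps cost
        for i in range(1, m + 1):
            cur[i] = min(alpha(x[i - 1], y[j - 1]) + prev[i - 1],   # match x_i with y_j
                         delta + prev[i],                           # y_j unmatched (gap)
                         delta + cur[i - 1])                        # x_i unmatched (gap)
        prev = cur
    return prev[m]
</code>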
+ | |||
+ | ===== 6.8: Shortest Paths in a Graph ===== | ||
  * This is the section I understand the most.
  * Goal: shortest paths in a directed graph whose edge costs may be negative.
  * If the graph contains a negative cycle, shortest paths are not well defined.
  * We could try to modify Dijkstra's algorithm, but it fails once edge costs can be negative; the Bellman-Ford dynamic programming approach works instead (sketched below).
  * The design, analysis, and implementation of this algorithm start on page 291.
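A minimal Bellman-Ford sketch based on the recurrence OPT(i, v) = min(OPT(i-1, v), min over edges (v, w) of c_vw + OPT(i-1, w)). The adjacency-list format (''graph[v]'' is a list of ''(w, cost)'' pairs, with every node present as a key) is my assumption; the code keeps a single rolling distance array rather than the full table.
<code python>
def bellman_ford(graph, t):
    """Shortest-path costs from every node to t; edge costs may be negative,
    but the graph is assumed to have no negative cycles."""
    nodes = list(graph)
    dist = {v: float("inf") for v in nodes}
    dist[t] = 0
    for _ in range(len(nodes) - 1):                # a shortest path uses at most n-1 edges
        for v in nodes:
            for w, cost in graph[v]:
                if cost + dist[w] < dist[v]:       # relax edge (v, w)
                    dist[v] = cost + dist[w]
    return dist
</code>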
+ | |||
+ | ===== 6.5-6.8 Final Words ===== | ||
This is a section that I did not understand particularly well, either in the book or in class. The shortest path portion was probably my strongest area, but I'm struggling a little bit with this material. Glad I did this journal on Monday so I know to get some extra help on this material before the problem set is due on Friday. Readability:
+ | |||
+ | ===== 6.9: Shortest Paths and Distance Vector Protocols ===== | ||
  * Routing in a communication network is modeled as a shortest-path problem:
    * Nodes are routers and edges are direct paths between these routers.
    * Find the minimum delay from a source node s to a destination node t.
  * Cannot use Dijkstra's algorithm, since it requires global knowledge of the network; each router only knows about its neighbors.
  * Bellman-Ford gives us the best option:
    * Use a "push-based" implementation, where a node only notifies its neighbors when its own distance value changes (sketched below).
    * This push-based method can be seen on page 298, and an asynchronous version can be found on page 299.
  * Problems:
    * Assumes edge costs will remain constant.
    * Can cause the "counting to infinity" problem when an edge is deleted or its cost increases.
  * For these reasons, path vector protocols are better than distance vector protocols.
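A rough sketch of the push-based idea: a node stays quiet until its own estimate changes, and only then do the nodes routing through it re-check their values. The queue-based bookkeeping below is my own way of simulating that, not the protocol's real message format, and it assumes no negative cycles and no edge-cost changes (which is exactly the weakness noted above).
<code python>
from collections import deque

def push_based_distances(graph, t):
    """graph[v]: list of (w, cost) edges out of v (every node, including t, is a key).
    Returns distance-to-t estimates."""
    dist = {v: float("inf") for v in graph}
    dist[t] = 0

    # incoming[w] = list of (v, cost) edges entering w, so w knows whom to notify
    incoming = {v: [] for v in graph}
    for v in graph:
        for w, cost in graph[v]:
            incoming[w].append((v, cost))

    active = deque([t])                            # nodes whose distance value just changed
    while active:
        w = active.popleft()
        for v, cost in incoming[w]:                # push the news to nodes that route into w
            if cost + dist[w] < dist[v]:
                dist[v] = cost + dist[w]
                active.append(v)                   # v changed too, so it pushes in turn
    return dist
</code>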
+ | |||
+ | |||
+ |