

Chapter 6

Dynamic Programming

Like divide and conquer, dynamic programming breaks a problem into subproblems and builds their solutions up into the correct solution for the whole problem.

6.1 Weighted Interval Scheduling: A Recursive Procedure

Designing a Recursive Algorithm

  • need a base case
  • then a step that reduces the problem to a smaller instance, on which the algorithm is called recursively

Weighted interval scheduling

  • for each job j we choose whether to include it in the solution or not
  • if we include job j, we run the algorithm on p(j), the latest job compatible with j
  • if we don't include job j, we look at the problem on jobs 1..j-1
  • the key is a memoization array M, which makes the recursion efficient (this comes in on pp. 255-257)
  • 6.1 OPT(j) = max(v_j + OPT(p(j)), OPT(j-1))
  • 6.2 Request j belongs to an optimal solution on the set {1, ..., j} if and only if v_j + OPT(p(j)) >= OPT(j-1)
  • algorithm on p. 254
  • now that we have the value of the optimal solution, we still need post-processing to recover the actual solution (see the sketch after this list)
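
A rough Python sketch of the memoized recursion (6.1), assuming jobs are numbered 1..n by finish time, v[j] is the value of job j, and p[j] (the latest job compatible with j) has already been computed; the function and array names, and the toy instance, are mine, not the book's pseudocode.

  # Memoized recursion for OPT(j) = max(v[j] + OPT(p(j)), OPT(j-1)).
  def compute_opt(j, v, p, M):
      """Value of an optimal schedule for jobs 1..j."""
      if j == 0:                      # base case: no jobs left
          return 0
      if M[j] is None:                # solve each subproblem only once
          M[j] = max(v[j] + compute_opt(p[j], v, p, M),
                     compute_opt(j - 1, v, p, M))
      return M[j]

  # Toy instance: v and p are 1-indexed, with dummy entries at index 0.
  v = [0, 2, 4, 4, 7, 2, 1]
  p = [0, 0, 0, 1, 0, 3, 3]
  n = len(v) - 1
  M = [None] * (n + 1)
  M[0] = 0
  print(compute_opt(n, v, p, M))      # prints 8 for this toy instance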

Computing the optimal solution

  • 6.5 Given the array M of the optimal values of the subproblems, Find-Solution returns an optimal solution in O(n) time (see the sketch below)
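
A matching sketch of the traceback in (6.5): once the array M is filled in, test (6.2) tells us at each step whether job j was used, so we can walk back through M without re-solving anything. This is a plain-Python rendering of the idea, not the book's exact Find-Solution pseudocode, and it assumes the v, p, and M arrays from the sketch above.

  def find_solution(j, v, p, M):
      """Recover an optimal set of jobs for 1..j from the filled array M."""
      chosen = []
      while j > 0:
          # Test (6.2): job j is in some optimal solution iff
          # v[j] + OPT(p(j)) >= OPT(j-1).
          if v[j] + M[p[j]] >= M[j - 1]:
              chosen.append(j)
              j = p[j]                # jump to the latest compatible job
          else:
              j = j - 1               # skip job j
      return chosen                   # each index visited at most once, so O(n)

  # e.g. find_solution(n, v, p, M) on the toy instance above returns [5, 3, 1]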

6.2 Principles of Dynamic Programming: Memoization or Iteration over Subproblems

  • memoization: the array-based details are on p. 259
  • the same recurrence can also be computed with an iterative algorithm that builds the array up from the smallest subproblem (see the sketch below)
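
A sketch of the iterative alternative, assuming the same 1-indexed v and p arrays as in the sketches above: instead of recursing and memoizing, we fill the array M from the smallest subproblem up.

  def iterative_compute_opt(v, p):
      """Fill M bottom-up: M[j] holds OPT(j) for weighted interval scheduling."""
      n = len(v) - 1
      M = [0] * (n + 1)                    # M[0] = 0 is the base case
      for j in range(1, n + 1):
          M[j] = max(v[j] + M[p[j]],       # include job j
                     M[j - 1])             # exclude job j
      return M                             # one O(n) pass over the subproblems

  # iterative_compute_opt(v, p)[n] == 8 on the toy instance above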

A basic outline of Dynamic Programming

  • there are only a polynomial number of subproblems
  • the solution to the original problem can be easily computed from the solutions to the subproblems
  • there is a natural ordering on subproblems from smallest to largest, together with an easy-to-compute recurrence that allows one to determine the solution to a subproblem from the solutions to some number of smaller subproblems

6.3 Segmented Least Squares: Multi-way Choices

The problem

  • minimize the total squared error between the points and the lines, by allowing more than one line (find the lines of best fit)
  • instead of a binary choice we have a multi-way choice: for the last segment, do we fit one line from point i to j, or one starting at some earlier point? Each line also carries a fixed cost C, so the answer can't just be to use as many lines as possible

Designing the algorithm

  • see the algorithm on page 265
  • 6.6 If the last segment of the optimal partition is p_i, ..., p_n, then the value of the optimal solution is OPT(n) = e(i,n) + C + OPT(i-1)
  • 6.7 For the subproblem on the points p_1, ..., p_j, OPT(j) = min over 1 <= i <= j of (e(i,j) + C + OPT(i-1)), and the segment p_i, ..., p_j is used in an optimum solution for the subproblem if and only if the minimum is obtained using index i (see the sketch after this list)
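
A rough Python sketch of the recurrence (6.7), with a helper that computes the least-squares error e(i,j) of a single line through a run of points. The names are mine, and the errors are recomputed on the fly rather than precomputed into a table as the book's algorithm does, so this version is slower but shows the multi-way min clearly.

  def segment_error(pts, i, j):
      """Minimum squared error of one line fit to pts[i..j] (0-indexed, inclusive)."""
      xs = [x for x, _ in pts[i:j + 1]]
      ys = [y for _, y in pts[i:j + 1]]
      m = len(xs)
      sx, sy = sum(xs), sum(ys)
      sxx = sum(x * x for x in xs)
      sxy = sum(x * y for x, y in zip(xs, ys))
      denom = m * sxx - sx * sx
      a = (m * sxy - sx * sy) / denom if denom != 0 else 0.0
      b = (sy - a * sx) / m
      return sum((y - a * x - b) ** 2 for x, y in zip(xs, ys))

  def segmented_least_squares(pts, C):
      """OPT(j) = min over i of (e(i, j) + C + OPT(i-1)); points are pts[0..n-1]."""
      n = len(pts)
      OPT = [0.0] * (n + 1)                # OPT(0) = 0: no points, no segments
      for j in range(1, n + 1):
          OPT[j] = min(segment_error(pts, i - 1, j - 1) + C + OPT[i - 1]
                       for i in range(1, j + 1))
      return OPT[n]                        # total penalty of the best partition

As with weighted interval scheduling, remembering which index i achieved the minimum for each j lets a traceback recover the actual segments.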

6.4 Subset Sums and Knapsacks: Adding a Variable

The problem

  • n items
  • knapsack can hold W total weight
  • each item i has a weight w_i
  • each item i also has a value v_i to the person
  • we want to maximize the total value while staying within the weight limit W
  • the key is again a memoization array, now indexed by two variables
  • the second key is that adding the extra variable creates many, many more subproblems

Designing an algorithm

  • the algorithm is given in Section 6.4
  • 6.8 If w < w_i then OPT(i, w) = OPT(i-1, w). Otherwise OPT(i, w) = max(OPT(i-1, w), w_i + OPT(i-1, w - w_i))
  • 6.9 The Subset-Sum(n, W) algorithm correctly computes the optimal value of the problem and runs in O(nW) time
  • 6.10 Given a table M of the optimal values of the subproblems, the optimal set S can be found in O(n) time
  • (again, we need post-processing over the table to recover the actual solution; see the sketch after this list)
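
A sketch of the table-filling version of (6.8)-(6.10) for the subset-sum case, where an item's value is its weight (the knapsack variant just replaces w[i] with a value v[i] in the max); the table layout, helper names, and small example are mine.

  def subset_sum(w, W):
      """M[i][cap] = OPT(i, cap); fills the (n+1) x (W+1) table in O(nW) time."""
      n = len(w) - 1                        # w is 1-indexed with a dummy w[0]
      M = [[0] * (W + 1) for _ in range(n + 1)]
      for i in range(1, n + 1):
          for cap in range(W + 1):
              if w[i] > cap:                # item i does not fit
                  M[i][cap] = M[i - 1][cap]
              else:                         # (6.8): skip item i, or take it
                  M[i][cap] = max(M[i - 1][cap],
                                  w[i] + M[i - 1][cap - w[i]])
      return M

  def recover_items(M, w, W):
      """(6.10): walk back through the table to find an optimal set in O(n)."""
      items, cap = [], W
      for i in range(len(w) - 1, 0, -1):
          if M[i][cap] != M[i - 1][cap]:    # item i must have been used
              items.append(i)
              cap -= w[i]
      return items

  w = [0, 2, 3, 4]                          # dummy entry at index 0
  M = subset_sum(w, W=6)
  print(M[3][6], recover_items(M, w, 6))    # prints 6 [3, 1]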

There is an extension section on the knapsack problem that looks at a different restriction on the weights.
