Overview
A good recap before moving on:
Algorithmic Paradigms
• Greedy. Build up a solution incrementally, myopically optimizing some local criterion.
• Divide-and-conquer. Break up a problem into independent subproblems (e.g., q subproblems of size n/2), solve each subproblem recursively, and combine the solutions to the subproblems into a solution to the original problem.
• Dynamic programming. Break up a problem into a series of overlapping subproblems, and build up solutions to larger and larger subproblems.
The essence: one implicitly explores the space of all possible solutions by carefully decomposing the problem into a series of subproblems, and then building up correct solutions to larger and larger subproblems.
6.1 Weighted Interval Scheduling: A Recursive Procedure
Summary: The previous interval scheduling problem is just the special case of this problem in which all weights are 1. Greedy won't work here.
Dynamic Programming
Memoization process:
• Create a table indexed by the possible inputs.
• If the value is already in the table, return it without recomputing it.
• Otherwise, call the function recursively and store the result in the table.
Problem. Goal: find a maximum-weight subset of mutually compatible jobs. Assume we have an optimal solution.
• OPT(j) = value of the optimal solution to the problem consisting of job requests 1, 2, …, j
Then for each job j there are two choices:
• OPT selects job j
• OPT does not select job j
That case analysis makes the algorithm easy.
Case 1: OPT(j) = v_j + OPT(p(j)), where p(j) = the largest index i < j such that job i is compatible with job j
Case 2: OPT(j) = OPT(j−1)
Combining the two cases: OPT(j) = max(v_j + OPT(p(j)), OPT(j−1)), with OPT(0) = 0.
Memoization. Store results of each subproblem in a cache; lookup as needed.
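As a concrete illustration, here is a minimal Python sketch of the memoized recurrence. The input format, the helper p, and the function names are my own, not from the book or slides: jobs are (start, finish, weight) tuples, sorted by finish time so that p(j) is well defined.

import bisect
from functools import lru_cache

def max_weight_schedule(jobs):
    # Sort jobs by finish time so p(j) is well defined.
    jobs = sorted(jobs, key=lambda job: job[1])
    starts = [s for s, f, w in jobs]
    finishes = [f for s, f, w in jobs]

    # p(j) = largest index i < j (1-based) whose job finishes no later
    # than job j starts, i.e. is compatible with job j; 0 if none exists.
    def p(j):
        return bisect.bisect_right(finishes, starts[j - 1], 0, j - 1)

    @lru_cache(maxsize=None)          # the memoization table
    def opt(j):
        if j == 0:                    # no jobs left: value 0
            return 0
        weight = jobs[j - 1][2]
        # Case 1: take job j; Case 2: skip job j.
        return max(weight + opt(p(j)), opt(j - 1))

    return opt(len(jobs))

# Example: jobs (1,3,w=5), (2,5,w=6), (4,6,w=5) -> best is 5 + 5 = 10
print(max_weight_schedule([(1, 3, 5), (2, 5, 6), (4, 6, 5)]))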
Example run-through on the slides.
interesting/readable: 7/7
6.2 Principles of Dynamic Programming: Memoization or Iteration over Subproblems
Summary:
A Basic Outline of Dynamic Programming:
From slides:
1. Determine the optimal substructure of the problem and define the recurrence relation.
2. Define the algorithm to find the value of the optimal solution.
3. Optionally, convert the algorithm to an iterative rather than recursive solution.
4. Define the algorithm to find the optimal solution itself.
5. Analyze the running time of the algorithms.
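To make steps 3 and 4 concrete, here is a sketch of the iterative (bottom-up) version of the weighted interval scheduling recurrence from 6.1, together with the traceback that recovers the actual solution. It assumes jobs are already sorted by finish time and p[j] has been precomputed (as a list with p[0] unused); the names are mine, not from the slides.

# Step 3: fill the table M bottom-up instead of recursing.
def iterative_opt(values, p):
    n = len(values)
    M = [0] * (n + 1)                  # M[j] = OPT(j), M[0] = 0
    for j in range(1, n + 1):          # build up larger subproblems
        M[j] = max(values[j - 1] + M[p[j]], M[j - 1])
    return M

# Step 4: trace back through M to find which jobs the optimum picked.
def find_solution(values, p, M):
    chosen, j = [], len(values)
    while j > 0:
        if values[j - 1] + M[p[j]] >= M[j - 1]:   # case 1 wins: job j is in
            chosen.append(j)
            j = p[j]
        else:                                      # case 2 wins: skip job j
            j -= 1
    return list(reversed(chosen))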
The big idea: Memoization
interesting/readable: 7/7
6.3 Segmented Least Squares: Multi-way Choices
Summary
The Problem (formal description): Suppose our data consists of a set P of n points in the plane, (x_1, y_1), (x_2, y_2), …, (x_n, y_n), and suppose x_1 < x_2 < … < x_n. Given a line L defined by the equation y = ax + b, we say that the error of L with respect to P is the sum of its squared “distances” to the points in P:

Error(L, P) = Σ_{i=1..n} (y_i − a·x_i − b)²
Simply put, we want a set of line segments that minimizes the total cost: a fixed cost C per segment plus the total squared error of the segments.
A property that makes things a little simpler: the last point p_n belongs to exactly one segment in the optimal partition, and that segment begins at some earlier point p_i. So the optimal solution is the best single segment through p_i, …, p_n plus the optimal solution for the points before p_i.
Some definitions:
OPT(j) = minimum cost for points from 1 up to point j
e(i, j) = minimum sum of squared errors for a single line through points i to j
At the beginning, nothing is included: OPT(0) = 0. If the last segment ends at point j and begins at point i, then
OPT(j) = min over 1 ≤ i ≤ j of [ e(i, j) + C + OPT(i−1) ], where
e(i, j): the error of the best single line through points i to j
C: the cost of adding this line segment from i to j
OPT(i−1): the optimal cost for all the points before i
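Here is a minimal Python sketch of the whole recurrence (function and variable names are mine, not from the book): it precomputes e(i, j) with the standard closed-form least-squares fit, then fills in OPT bottom-up.

def segmented_least_squares(points, C):
    points = sorted(points)              # recurrence needs x1 < x2 < ... < xn
    n = len(points)
    xs = [x for x, y in points]
    ys = [y for x, y in points]

    # e[i][j] = squared error of the best single line through points i..j (1-based).
    e = [[0.0] * (n + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for j in range(i, n + 1):
            m = j - i + 1
            px, py = xs[i - 1:j], ys[i - 1:j]
            sx, sy = sum(px), sum(py)
            sxx = sum(x * x for x in px)
            sxy = sum(x * y for x, y in zip(px, py))
            denom = m * sxx - sx * sx
            a = (m * sxy - sx * sy) / denom if denom else 0.0  # slope (flat if one point)
            b = (sy - a * sx) / m                              # intercept
            e[i][j] = sum((y - a * x - b) ** 2 for x, y in zip(px, py))

    # OPT(j) = min over 1 <= i <= j of e(i, j) + C + OPT(i - 1), with OPT(0) = 0.
    OPT = [0.0] * (n + 1)
    for j in range(1, n + 1):
        OPT[j] = min(e[i][j] + C + OPT[i - 1] for i in range(1, j + 1))
    return OPT[n]

# Example: (1,1),(2,2),(3,3) are collinear but (4,10) is not, so with C = 1
# two segments (total cost 2.0) beat one line through all four points.
print(segmented_least_squares([(1, 1), (2, 2), (3, 3), (4, 10)], C=1.0))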
interesting/readable: 7/7