We just worked out a solution to one particular recurrence relation. We shall now consider a class of recurrence relations that generalizes it and see how to solve the recurrences in this class. The more general class is obtained by considering divide-and-conquer algorithms that create recursive calls on q subproblems of size n/2 each, and then combine these solutions in O(n) time. The general recurrence bounding such an algorithm's run time is: T(n) <= qT(n/2) + cn, for some constant c.
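To make the setup concrete, here is a minimal Python sketch of the kind of divide-and-conquer algorithm this recurrence describes. The function name, the placeholder work, and the combine step are all hypothetical; only the call structure (q recursive calls on half-size subproblems plus a linear-time combine) reflects the recurrence.

```python
def solve(data, q):
    """Hypothetical skeleton of the algorithm class behind T(n) <= q*T(n/2) + c*n.
    The per-problem work is a placeholder; only the call structure matters."""
    n = len(data)
    if n <= 1:
        return list(data)                              # base case: constant work

    half = data[: n // 2]                              # subproblems have size n/2
    sub_results = [solve(half, q) for _ in range(q)]   # q recursive calls

    # combine step: one linear pass over the input -- this is the "cn" term
    combined = list(sub_results[0]) + list(data[n // 2 :])   # placeholder O(n) "merge"
    return combined
```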
We shall now see how to solve the above recurrence using three methods: unrolling the recursion, substitution, and partial substitution. The cases q = 1, q = 2, and q > 2 behave differently and are considered separately.
The Case of q > 2 Subproblems. Process of figuring out the run time by unrolling the recursion:
- Analyzing the first few levels: At the first level of recursion we have a single problem of size n, which takes at most cn time. At the next level we have q problems, each of size n/2, each taking at most cn/2 time, for a total of at most (q/2)cn. The next level yields q^2 problems of size n/4 each, for a total of at most (q^2/4)cn time.
- Identifying a pattern: At an arbitrary level j, we have q^j distinct instances, each of size n/2^j and each taking at most cn/2^j time. Thus the total work at level j is at most (q/2)^j * cn.
- Summing over all levels of recursion: There are about log_2 n levels, and the total work performed is the sum of (q/2)^j * cn over all of them. Because q > 2, this is a geometric series with ratio greater than 1, so it is dominated by its last term, which is at most (q/2)^(log_2 n) * cn = c * n^(log_2 q). This yields an asymptotic run time of O(n^(log_2 q)). Thus, any function satisfying the recurrence above with q > 2 is bounded by O(n^(log_2 q)).
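As a quick sanity check on this bound (my own sketch, assuming n is a power of 2 and c = 1), the per-level work (q/2)^j * cn can be summed directly and compared to n^(log_2 q):

```python
import math

def total_unrolled_work(n, q, c=1.0):
    """Sum the per-level work (q/2)**j * c*n over the ~log2(n) levels of recursion."""
    levels = int(math.log2(n))
    return sum((q / 2) ** j * c * n for j in range(levels + 1))

q = 3                                    # any q > 2
for n in [2 ** k for k in range(6, 16, 3)]:
    bound = n ** math.log2(q)            # the claimed O(n^(log_2 q)) growth rate
    print(n, round(total_unrolled_work(n, q) / bound, 3))
# the ratio settles near a constant, consistent with T(n) = O(n^(log_2 q))
```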
Applying Partial Substitution. Another method of solving the recurrence: for q > 2, suppose we guess that the solution has the form T(n) <= k*n^d for some constants k > 0 and d > 1. This is quite a general guess, since we have not even specified the exponent d; starting the inductive argument and seeing what constraints it places on k and d leads back to the same solution.
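A sketch of how the inductive step can be carried out under this guess (my own working, assuming n is a power of 2; not a quotation of the text):

```latex
\begin{aligned}
T(n) &\le q\,T(n/2) + cn \\
     &\le q \cdot k\,(n/2)^{d} + cn \\
     &= \frac{q}{2^{d}}\,k\,n^{d} + cn .
\end{aligned}
```

For the right-hand side to be at most k*n^d we need q/2^d < 1, i.e. d > log_2 q, together with k >= c/(1 - q/2^d) (using d > 1); the step fails for d <= log_2 q, so the substitution points to the same exponent, log_2 q, as the unrolling argument.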
The Case of One Subproblem. Process of figuring out the run time:
- Analyzing the first few levels: At the first level we have a single problem of size n, which takes at most cn time. The next level has one problem of size n/2, which contributes at most cn/2, and the level after that has one problem of size n/4 that takes at most cn/4 time.
- Identifying a pattern: At an arbitrary level j, we have one instance of size n/2^j, which contributes at most cn/2^j to the running time.
- Summing over all levels of recursion: The total work is at most cn(1 + 1/2 + 1/4 + ...) <= 2cn, a convergent geometric series, so the asymptotic run time is O(n).
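A small toy illustration of the q = 1 pattern (a hypothetical example of mine, not from the text): the function below does one linear pass per level and then recurses on a single half-size subproblem, and the number of elements it touches stays below 2n even though there are about log_2 n levels.

```python
def toy_one_subproblem(a, counter):
    """Hypothetical q = 1 recursion: linear work per level, one half-size recursive call."""
    n = len(a)
    if n <= 1:
        return
    counter[0] += n                      # the cn work at this level (here c = 1)
    toy_one_subproblem(a[: n // 2], counter)

for n in (1 << 6, 1 << 10, 1 << 14):
    touched = [0]
    toy_one_subproblem(list(range(n)), touched)
    print(n, touched[0], touched[0] / n)   # ratio stays below 2, i.e. O(n) total work
```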
Although the q = 1 algorithm still performs about log_2 n levels of recursion, the overall run time is bounded by O(n). To summarize the three cases:
- q = 1: the resulting time is linear, O(n).
- q = 2: the resulting time is O(n log n).
- q > 2: the resulting time is a polynomial bound, O(n^(log_2 q)), with an exponent larger than 1.
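To tie the three cases together, here is a brief numeric check (a sketch under the assumptions that the recurrence holds with equality, c = 1, and T(1) = 1): evaluate T(n) = q*T(n/2) + n directly and compare it with the claimed growth rates.

```python
import math
from functools import lru_cache

def make_T(q):
    @lru_cache(maxsize=None)
    def T(n):
        # take the recurrence with equality, c = 1, and T(1) = 1
        return 1 if n <= 1 else q * T(n // 2) + n
    return T

n = 1 << 18
bounds = {1: n, 2: n * math.log2(n), 3: n ** math.log2(3)}
for q, bound in bounds.items():
    print(q, round(make_T(q)(n) / bound, 3))
# each ratio is a modest constant, matching O(n), O(n log n), and O(n^(log_2 q))
```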
A related recurrence: T(n) <= 2T(n/2) + O(n^2). The difference from the earlier recurrence is that this one takes quadratic rather than linear time to combine the solutions to the subproblems, so the amount of work per level is larger; at the top level it is larger by a factor equal to the input size.
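Unrolling this recurrence the same way (my own working, assuming n is a power of 2 and writing the combine cost as cn^2): level j has 2^j subproblems of size n/2^j, each costing at most c(n/2^j)^2, so

```latex
\begin{aligned}
\text{work at level } j &= 2^{j} \cdot c\left(\frac{n}{2^{j}}\right)^{2} = \frac{c n^{2}}{2^{j}}, \\
\text{total work} &\le \sum_{j \ge 0} \frac{c n^{2}}{2^{j}} \le 2 c n^{2} = O(n^{2}).
\end{aligned}
```

The per-level work now decreases geometrically down the levels, so the bound is O(n^2) rather than the O(n^2 log n) one might guess by analogy with the q = 2, linear-combine case.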
I enjoyed reading through this section and understanding more about finding and solving the recurrence relation of an algorithm. I give it a 9/10.