
2.4 A Survey of Common Running Times

When analyzing algorithms, we need an approximate sense of the “landscape” of different running times. Most problems have a natural “search space”: the set of all possible solutions. When we say we are looking for an efficient algorithm, our goal is really to find an algorithm that is faster than brute-force enumeration of this search space. Thus we should keep two bounds in mind when analyzing an algorithm: the bound determined by the natural search space, which is the running time of the brute-force algorithm for the problem, and the running time we hope to achieve. This section discusses the most common running times of algorithms.

Linear Time O(n)

The running time of these algorithms is at most a constant factor times the size of the input. Such algorithms typically process the input in a single pass, spending a constant amount of time on each input item. Examples: computing the maximum of n numbers, and merging two sorted lists.

Computing the Maximum

algorithm:

        max = a<sub>1</sub>  \\ 
        for i = 2 to n:  \\ 
            if a<sub>i</sub> is greater than max then  \\ 
                  set max = a<sub>i</sub>  \\ 
            Endif  \\ 
        Endfor \\ 
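
As a rough illustration, here is a minimal Python sketch of the same maximum-finding loop (the function name find_max is just an illustrative choice):

<code python>
def find_max(a):
    """Return the maximum of a non-empty list in a single O(n) pass."""
    current_max = a[0]
    for x in a[1:]:              # constant work per element
        if x > current_max:
            current_max = x
    return current_max

print(find_max([3, 1, 4, 1, 5, 9, 2, 6]))  # 9
</code>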

Merging Two Sorted Lists

algorithm:

  To merge sorted lists A = a<sub>1</sub>,a<sub>2</sub>,a<sub>3</sub>,...,a<sub>n</sub> and B = b<sub>1</sub>,b<sub>2</sub>,b<sub>3</sub>,...,b<sub>n</sub>:
  Maintain a //current// pointer into each list, initialized to point to the front elements
  While both lists are nonempty:
     Let a<sub>i</sub> and b<sub>j</sub> be the elements pointed to by the //current// pointer
     Append the smaller of these two to the output list
     Advance the //current// pointer in the list from which the smaller element was selected
  EndWhile
  Once one list is empty, append the remainder of the other list to the output.
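
A possible Python rendering of this merge procedure (the name merge is illustrative):

<code python>
def merge(a, b):
    """Merge two sorted lists into one sorted list in O(n) time."""
    i, j = 0, 0                  # the two "current" pointers
    out = []
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:         # append the smaller front element
            out.append(a[i])
            i += 1
        else:
            out.append(b[j])
            j += 1
    out.extend(a[i:])            # one of these slices is empty;
    out.extend(b[j:])            # the other is the remainder of its list
    return out

print(merge([1, 4, 7], [2, 3, 9]))  # [1, 2, 3, 4, 7, 9]
</code>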
 

To show that this algorithm spends a constant amount of time on each element:

        Suppose the cost of each iteration is charged to the element that is selected and added to the output list. \\
        An element can be charged only once, since at the moment it is first charged it is added to the output and never looked at again by the algorithm. \\
        But there are only 2//n// elements in total, and the cost of every iteration is accounted for by a charge to some element. \\
        Thus there can be at most 2//n// iterations. \\
        Each iteration takes a constant amount of time, so in total the algorithm takes O(//n//) time. \\

O(n log n) Time

This is the running time of any algorithm that splits the input into two equal-sized pieces, solves each piece recursively, and then combines the two solutions in linear time. Examples: Mergesort; or, given n time-stamps at which copies of a file arrived at a server, finding the largest interval of time between the first and last of these time-stamps during which no copy of the file arrived (sort the time-stamps, then scan the sorted list for the largest gap between consecutive entries).
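
For instance, a self-contained Mergesort sketch in Python, following the split-recurse-merge pattern described above:

<code python>
def merge_sort(a):
    """Split the list in half, sort each half recursively, merge in linear time."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    # linear-time merge of the two sorted halves
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])
    out.extend(right[j:])
    return out

print(merge_sort([5, 2, 9, 1, 5, 6]))  # [1, 2, 5, 5, 6, 9]
</code>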

Quadratic Time O(n²)

Quadratic running time often arises for problems involving pairs of items. Example: given n points in the plane, each specified by (x, y) coordinates, find the pair of points that are closest together. The brute-force algorithm for this problem enumerates all pairs of points, computes the distance between each pair, and then chooses the pair for which this distance is smallest. The number of pairs is O(n<sup>2</sup>): there are n ways to choose the first point and n - 1 ways to choose the second (almost n), so there are roughly n<sup>2</sup>/2 distinct pairs.

Algorithm: For each input point (x<sub>i</sub>,y<sub>i</sub>)

               For each other input point (x<sub>j</sub>,y<sub>j</sub>)
                    compute d = √((x<sub>i</sub> - x<sub>j</sub>)<sup>2</sup> + (y<sub>i</sub> - y<sub>j</sub>)<sup>2</sup>)
                    if d is less than the current minimum, update minimum to d
               Endfor
           Endfor
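
A minimal Python sketch of this brute-force closest-pair search (the names closest_pair and best are illustrative):

<code python>
from math import sqrt, inf

def closest_pair(points):
    """Brute force: compute the distance between every pair of points, O(n^2) pairs."""
    best, best_pair = inf, None
    n = len(points)
    for i in range(n):
        xi, yi = points[i]
        for j in range(i + 1, n):            # each unordered pair once
            xj, yj = points[j]
            d = sqrt((xi - xj) ** 2 + (yi - yj) ** 2)
            if d < best:                     # smaller distance found: update minimum
                best, best_pair = d, (points[i], points[j])
    return best_pair, best

print(closest_pair([(0, 0), (3, 4), (1, 1)]))  # ((0, 0), (1, 1)), 1.414...
</code>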

Another common source of quadratic running time is a pair of nested loops within an algorithm, each loop taking O(n) time.

Cubic Time O(n³)

Example: Given n sets S<sub>1</sub>,...,S<sub>n</sub>, each of which is a subset of {1,...,n}, determine whether some pair of these sets is disjoint.

algorithm:

for each set S<sub>i</sub>  \\ 
  for each other set S<sub>j</sub> \\ 
     for each element //p// of S<sub>i</sub>  \\ 
       Determine whether //p// also belongs to S<sub>j</sub> \\
     Endfor  \\
     if no element of S<sub>i</sub> belongs to S<sub>j</sub> then  \\
           Report that S<sub>i</sub> and S<sub>j</sub> are disjoint  \\
     Endif  \\
   Endfor \\
Endfor  \\
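
A possible Python version of this check, assuming the n sets are stored as Python set objects so that a single membership test takes constant time on average (O(n) per pair of sets, O(n<sup>2</sup>) pairs, O(n<sup>3</sup>) in total):

<code python>
def find_disjoint_pair(sets):
    """Check every pair of sets element by element; return indices of a disjoint pair."""
    for i, si in enumerate(sets):
        for j, sj in enumerate(sets):
            if i != j and all(p not in sj for p in si):
                return i, j                  # no element of S_i belongs to S_j
    return None                              # every pair of sets intersects

print(find_disjoint_pair([{1, 2}, {2, 3}, {4, 5}]))  # (0, 2)
</code>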

O(n<sup>k</sup>) Time

This running time arises when we search over all subsets of size k. Example: given a graph on n nodes, determine whether it contains an independent set of size k (with k a constant). There are at most n<sup>k</sup> subsets of size k, and checking whether a given subset is an independent set takes time polynomial in k, so for constant k the total running time is O(n<sup>k</sup>).

Algorithm:

for each subset S of k nodes \\ 
    check whether S constitutes an independent set \\ 
    if S is an independent set then \\ 
       Stop and declare success \\ 
    Endif \\
Endfor \\
if no k-node independent set was found then  \\
   Declare failure \\
Endif \\             
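
One way this might look in Python, using itertools.combinations to enumerate the k-node subsets (the node-list/edge-list representation of the graph is just an illustrative assumption):

<code python>
from itertools import combinations

def has_independent_set(nodes, edges, k):
    """Try every k-node subset; there are roughly n^k of them for constant k."""
    edge_set = {frozenset(e) for e in edges}
    for subset in combinations(nodes, k):
        # S is independent if no two of its nodes are joined by an edge
        if all(frozenset((u, v)) not in edge_set
               for u, v in combinations(subset, 2)):
            return subset                    # success: a k-node independent set
    return None                              # failure: no k-node independent set

print(has_independent_set([1, 2, 3, 4], [(1, 2), (2, 3), (3, 4)], 2))  # (1, 3)
</code>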

Beyond Polynomial Time

Example: Given a graph, find an independent set of maximum size.
Algorithm:

for each subset S of nodes \\ 
   check whether S constitutes an independent set \\
   If S is a larger independent set than the largest seen so far, then \\ 
       Record the size of S as the current maximum \\ 
   Endif \\ 
Endfor \\ 

The running time of this brute-force algorithm is O(n<sup>2</sup>2<sup>n</sup>), since there are 2<sup>n</sup> subsets of an n-node graph and checking each one takes O(n<sup>2</sup>) time.
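
As a sketch, the same brute force in Python, enumerating all 2<sup>n</sup> subsets with bitmasks (the edge-list representation is an illustrative choice):

<code python>
def max_independent_set(nodes, edges):
    """Enumerate all 2^n subsets; checking a subset costs O(n^2), so O(2^n n^2) total."""
    n = len(nodes)
    best = []
    for mask in range(1 << n):                           # all 2^n subsets
        subset = [nodes[i] for i in range(n) if mask >> i & 1]
        independent = all((u, v) not in edges and (v, u) not in edges
                          for k, u in enumerate(subset) for v in subset[k + 1:])
        if independent and len(subset) > len(best):
            best = subset                                # record the larger set
    return best

print(max_independent_set([1, 2, 3, 4], [(1, 2), (2, 3), (3, 4)]))  # [1, 3]
</code>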

Another example arises when we must search over all n! ways to arrange n items in order; the Traveling Salesman Problem, which asks for the shortest tour through n cities, is a classic case, since each tour corresponds to an ordering of the cities.
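
For example, a brute-force Traveling Salesman sketch in Python that tries every ordering of the cities; the small distance matrix dist is made up for illustration:

<code python>
from itertools import permutations

def tsp_brute_force(dist):
    """Try all (n-1)! tours that start and end at city 0, given a distance matrix."""
    n = len(dist)
    best_len, best_tour = float("inf"), None
    for perm in permutations(range(1, n)):               # every ordering of the other cities
        tour = (0,) + perm + (0,)
        length = sum(dist[tour[i]][tour[i + 1]] for i in range(n))
        if length < best_len:
            best_len, best_tour = length, tour
    return best_tour, best_len

dist = [[0, 2, 9, 10],
        [2, 0, 6, 4],
        [9, 6, 0, 8],
        [10, 4, 8, 0]]
print(tsp_brute_force(dist))  # a shortest tour and its length
</code>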

Sublinear Time

This running time appears when the algorithm only needs to query the input rather than read all of it, the goal being to minimize the amount of querying that must be done.
Example: binary search. Its total running time is O(log n), since each query shrinks the region of the input that could contain the answer by half.
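
A minimal Python sketch of binary search over a sorted list:

<code python>
def binary_search(a, target):
    """Probe a sorted list, halving the active region each step: O(log n) probes."""
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if a[mid] == target:
            return mid                       # found: return its index
        elif a[mid] < target:
            lo = mid + 1                     # discard the left half
        else:
            hi = mid - 1                     # discard the right half
    return -1                                # not present

print(binary_search([1, 3, 5, 7, 9, 11], 7))  # 3
</code>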

This section was easy to read, and the material in it was all familiar. I give it a 9/10.