The last recurrence relation is T(n) ≤ 2 T(n/2) + O(n^2), which arises when it takes quadratic time to split up and combine the halves.  This section was pretty confusing and abstract to me.  I think concrete examples would help me a lot with these recurrence relations.
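
As a concrete check of that last recurrence (my own sketch, not from the book), unrolling T(n) = 2T(n/2) + n^2 shows the combine cost halves at every level, so the total is a geometric series dominated by the top level, i.e. O(n^2):

```python
def T(n):
    # Unroll T(n) = 2*T(n/2) + n^2 with T(1) = 1.
    # Each level down contributes 2 * (n/2)^2 = n^2 / 2, half the level above,
    # so the total is n^2 * (1 + 1/2 + 1/4 + ...) = O(n^2).
    if n <= 1:
        return 1
    return 2 * T(n // 2) + n * n

for n in [2 ** k for k in range(4, 12)]:
    print(n, T(n) / (n * n))  # the ratio approaches the constant 2
```

Seeing the ratio T(n)/n^2 level off at a constant instead of growing is what makes the O(n^2) bound feel concrete.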

=== 5.3 Counting Inversions ===

This problem arises when you are comparing two sets of rankings, for example to determine how close two people's movie preferences are.  A natural way to quantify this is to look at the two rankings and see how "out of order" they are.  This works by counting inversions: a pair i, j is an inversion if i comes before j in one person's ranking but j comes before i in the other's.  Therefore, we want an algorithm that will count the number of inversions.

The brute-force approach takes O(n^2) time to check all pairs of numbers and see if they form an inversion.  However, we can do better than this.  We can use the divide and conquer approach to split the list into two lists of n/2 values, then find the inversions in each half.  The trick is to be able to count the inversions between the two halves in linear time.  If we make the recursive step sort in addition to counting, it is a lot easier to see inversions between the two lists.  We can do this with a pointer into each list, comparing the pointed-to elements, appending the smaller one to the merged output, and advancing that pointer, which is just like mergesort.  We can easily count the inversions as we go, because if the element at the second list's pointer is smaller, it is inverted with everything at or after the pointer in the first list.  The algorithm for this is on pages 224-225 and runs in O(n log n) time, as we hoped.
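
The merge-and-count idea can be sketched like this (my own Python version of the approach, not the book's exact pseudocode from pages 224-225):

```python
def sort_and_count(a):
    """Merge-sort a list while counting inversions: pairs i < j with a[i] > a[j]."""
    if len(a) <= 1:
        return a, 0
    mid = len(a) // 2
    left, inv_left = sort_and_count(a[:mid])
    right, inv_right = sort_and_count(a[mid:])
    merged, i, j = [], 0, 0
    inversions = inv_left + inv_right
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            # right[j] is smaller than every remaining element of `left`,
            # so it forms an inversion with each of them
            merged.append(right[j])
            j += 1
            inversions += len(left) - i
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged, inversions

print(sort_and_count([2, 4, 1, 3, 5])[1])  # 3 inversions: (2,1), (4,1), (4,3)
```

The only change from plain mergesort is the `inversions += len(left) - i` line, which is why the whole thing still runs in O(n log n).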

=== 5.4 Finding the Closest Pair of Points ===

This problem is simply to find the closest pair among n points in the plane.  The brute-force solution is to look at all pairs of points, which takes O(n^2) time.  Assuming no two points have the same x-coordinate, you can order the points by their x-coordinates and draw a vertical line that divides the points in half.  Then you can recursively find the smallest distance between two points on each side.  The tricky part is handling the distances between points on opposite sides.  If d is the minimum of the shortest distances found on either side, then you only need to consider points within d of the dividing line when looking at cross-side distances.
 +
 +In this step, it turns out we can bound the number of comparisons to 12 (to start with).  If we divide the 2d area into boxes of 1/2d we can first of all guarantee that there will only be one point per box.  Then we only need to look at boxes that are within 2 of the point.  This limits us to 12.  We can do even better by eliminating the boxes on the same side of the dividing line, which brings us down to a max of 7 comparisons.  Thus, our algorithm, on page 230, will run in O(n logn) time. 
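
Here is a rough Python sketch of the whole divide-and-conquer algorithm, using the 7-comparison bound for the strip; the variable names are my own, and it assumes distinct x-coordinates as in the text:

```python
from math import hypot, inf

def closest_pair(points):
    """Return the smallest distance between any two of the given (x, y) points.
    Sketch of the divide-and-conquer approach; assumes distinct x-coordinates."""
    px = sorted(points)                       # sorted by x
    py = sorted(points, key=lambda p: p[1])   # sorted by y

    def rec(px, py):
        if len(px) <= 3:  # brute force the small cases
            best = inf
            for i in range(len(px)):
                for j in range(i + 1, len(px)):
                    best = min(best, hypot(px[i][0] - px[j][0],
                                           px[i][1] - px[j][1]))
            return best
        mid = len(px) // 2
        x_mid = px[mid][0]
        left_py = [p for p in py if p[0] < x_mid]
        right_py = [p for p in py if p[0] >= x_mid]
        d = min(rec(px[:mid], left_py), rec(px[mid:], right_py))
        # only points within d of the dividing line can beat d
        strip = [p for p in py if abs(p[0] - x_mid) < d]
        for i, p in enumerate(strip):
            # the box-counting argument above bounds how many later strip
            # points (in y-order) can be within d: 7 is enough
            for q in strip[i + 1:i + 8]:
                d = min(d, hypot(p[0] - q[0], p[1] - q[1]))
        return d

    return rec(px, py)

print(closest_pair([(0, 0), (3, 4), (1, 1), (5, 0)]))  # 1.4142... for (0,0)-(1,1)
```

Each level of recursion does only linear work (the filters and the bounded strip scan), which is where the O(n log n) total comes from.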

=== 5.5 Integer Multiplication ===

The problem in this section is that of basic integer multiplication.  However, when you look at the details of what you do when you multiply two n-digit integers, you see that you compute a lot of small partial products, which results in a running time of O(n^2).  The main objective of this section is to determine whether we actually need all of these partial products, because if we don't, we can reduce the running time.  If we break each integer in half (so we have four n/2-digit numbers), we can do four half-size multiplications to determine x*y, as on page 232.  Unfortunately, it turns out this still runs in quadratic time, just like ordinary integer multiplication.

Therefore, we need to get away with only three half-size multiplications if we want to reduce the running time.  This gives a recurrence relation with q = 3, which yields a running time of O(n^1.59).  The key to the algorithm is to compute (x1 + x0)(y1 + y0), where x1 and y1 are the high-order halves of the two integers x and y, and x0 and y0 are the low-order halves.  This single recursive multiplication gives the sum of all four partial products, so subtracting the separately computed x1*y1 and x0*y0 leaves exactly the middle term we need.  This algorithm is on page 233.
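
This three-product trick is Karatsuba's algorithm.  A minimal sketch in Python, splitting on bits rather than decimal digits (my own choice for convenience, not the book's presentation):

```python
def karatsuba(x, y):
    """Multiply nonnegative integers with three recursive half-size products."""
    if x < 10 or y < 10:
        return x * y
    half = max(x.bit_length(), y.bit_length()) // 2
    x1, x0 = x >> half, x & ((1 << half) - 1)  # high and low halves of x
    y1, y0 = y >> half, y & ((1 << half) - 1)  # high and low halves of y
    p = karatsuba(x1, y1)            # high product
    q = karatsuba(x0, y0)            # low product
    r = karatsuba(x1 + x0, y1 + y0)  # combined product
    # r - p - q = x1*y0 + x0*y1, the middle term, with no fourth recursion
    return (p << (2 * half)) + ((r - p - q) << half) + q

print(karatsuba(1234, 5678))  # 7006652
```

Three recursive calls on half-size inputs is exactly the q = 3 recurrence, which is where the O(n^1.59) bound comes from.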
  