courses:cs211:winter2018:journals:mccaffreyk:home:2.3 (last modified 2018/01/30 04:21 by mccaffreyk)

===== Chapter 2 =====


==== Section 2.1: Computational Tractability ====

Many of the problems we will focus on have clearly stated conditions. Efficiency is an important component of algorithms. In addition to run-time efficiency, we must also consider concepts such as space efficiency. As we have touched on already, efficient algorithms generally lack unnecessary details. Efficiency is not simply how fast an algorithm runs: there are many components to efficiency, such as scale, location, and how the algorithm is run. Thus, in forming a definition of efficiency we must use mathematics rather than simply relying on base intuition. Our approach to the speed component of efficiency will be the running time of the worst case. Worst-case analysis is easier to handle (it involves fewer complications) than the alternatives.
| + | |||
| + | |||
| + | ==== Section 2.2: Asymptotic Order of Growth ==== | ||
| + | |||
We analyze run-time in terms of steps (an arithmetic operation, following a pointer, looking up an entry, or assigning a value to a variable). We will ignore the coefficients and all but the highest-order term in our Big-O notation. This is mainly because we want to emphasize similarity between algorithms, it is easier to derive, and at large (significant) input sizes the largest term in the equation holds far more significance than the others. Asymptotic upper bound: T(n) = O(f(n)) if there exist constants c > 0 and n0 >= 0 such that T(n) <= c * f(n) for all n >= n0.
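A quick sketch (my own example, not from the book) of why dropping coefficients and lower-order terms is justified: a single constant c and threshold n0 witness the upper bound.

```python
# For T(n) = 3n^2 + 10n + 5, the constants c = 4 and n0 = 11 witness
# T(n) = O(n^2): T(n) <= c * n^2 holds for every n >= n0, because the
# lower-order terms 10n + 5 are eventually dwarfed by one extra n^2.

def T(n):
    return 3 * n**2 + 10 * n + 5

c, n0 = 4, 11
assert all(T(n) <= c * n**2 for n in range(n0, 10_000))
```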
| + | |||
| + | |||
| ====== Section 2.3: | ====== Section 2.3: | ||
| - | | + | This chapter discusses the importance of picking the right data structures in terms of run-time. For instance, it reviewed how linked |
| - | lists are constant for search and deletion yet not for access while arrays have constant access time but linear, O(n) addition. | + | lists are constant for search and deletion yet not for access while arrays have constant access time but linear, O(n) addition. |
| - | Overall, linked lists are therefore better for dynamic data subject to change whereas arrays are best when the data maintains its | + | Overall, linked lists are therefore better for dynamic data subject to change whereas arrays are best when the data maintains its |
| - | size. We can convert between arrays and lists in O(n) time and it is often worth it such as when the input comes in an inefficient | + | size. We can convert between arrays and lists in O(n) time and it is often worth it such as when the input comes in an inefficient |
| - | format. We apply out learnings to the stable matching algorithm. To make the algorithm O(n^2), all components in the loop must run in | + | format. We apply out learnings to the stable matching algorithm. To make the algorithm O(n^2), all components in the loop must run in |
| - | constant time. To achieve this, we can represent free men as a linked list, we build an array to represent the indexes of the next | + | constant time. To achieve this, we can represent free men as a linked list, we build an array to represent the indexes of the next |
| - | choices of women,and use an array to hold women' | + | choices of women,and use an array to hold women' |
| - | for women' | + | for women' |
| - | this section was quick and easy to read and understand. I would rate it at 8/10 to 9/10. One question I have is why the book didn't | + | this section was quick and easy to read and understand. I would rate it at 8/10 to 9/10. One question I have is why the book didn't |
| - | discuss possibly using other data structures for men and women such as stacks or dictionaries. Also, I am curious how memory would | + | discuss possibly using other data structures for men and women such as stacks or dictionaries. Also, I am curious how memory would |
| - | factor into this design in a real world scenario. | + | factor into this design in a real world scenario. |
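The O(n^2) implementation described above can be sketched roughly as follows (a minimal illustration with made-up preference lists and my own variable names, not the book's code):

```python
# Gale-Shapley in O(n^2) using the structures from the section: a list of
# free men, a next_choice array (index of each man's next proposal), a
# current array (each woman's current partner), and ranking arrays that
# let a woman compare two men in O(1).

def stable_matching(men_prefs, women_prefs):
    n = len(men_prefs)
    # ranking[w][m] = position of man m in woman w's preference list
    ranking = [[0] * n for _ in range(n)]
    for w in range(n):
        for pos, m in enumerate(women_prefs[w]):
            ranking[w][m] = pos
    next_choice = [0] * n     # index into each man's preference list
    current = [None] * n      # current[w] = woman w's partner, or None
    free_men = list(range(n))
    while free_men:
        m = free_men.pop()
        w = men_prefs[m][next_choice[m]]     # m's best un-proposed woman
        next_choice[m] += 1
        if current[w] is None:
            current[w] = m                   # w was free: engage
        elif ranking[w][m] < ranking[w][current[w]]:
            free_men.append(current[w])      # w prefers m: trade up
            current[w] = m
        else:
            free_men.append(m)               # w rejects m
    return current                           # current[w] = matched man

# Example with 2 men and 2 women (indices, not names):
print(stable_matching([[0, 1], [0, 1]], [[0, 1], [1, 0]]))  # prints [0, 1]
```

Every loop iteration does only constant work, and each man proposes to each woman at most once, giving at most n^2 iterations.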
| + | |||
| + | ==== Section 2.4: A Survey of Common Running Times ==== | ||
| | | ||
This section discusses some common run-times for programs and the reasons they are so prevalent. These run-times include O(n), O(n log n), O(n^2), O(n^3), and others.
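Two of these run-times come from very recognizable code shapes (my own toy examples, not the book's): a single pass over the input is O(n), while comparing all pairs is O(n^2).

```python
def max_value(nums):
    # O(n): one pass, constant work per element.
    best = nums[0]
    for x in nums[1:]:
        if x > best:
            best = x
    return best

def closest_pair_gap(nums):
    # O(n^2): examine every pair i < j, constant work per pair.
    best = abs(nums[0] - nums[1])
    for i in range(len(nums)):
        for j in range(i + 1, len(nums)):
            best = min(best, abs(nums[i] - nums[j]))
    return best

print(max_value([3, 1, 4, 1, 5]))       # prints 5
print(closest_pair_gap([10, 3, 7, 1]))  # prints 2 (the gap |3 - 1|)
```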
| + | |||
| + | ==== Section 2.5: A More Complex Data Structure: Priority Queues | ||
Sometimes we can improve running time by using more complex data structures. The priority queue is one of these. In a priority queue, each value has a key; the element with the smallest key has first priority and the one with the largest key has last priority. Priority queues may be applied to manage the order of processes in a computer. For operations such as adding, deleting, or selecting the element with the minimum key, priority queues generally run in O(log n) time. Thus, extracting all n minima, or using a priority queue to sort, takes O(n log n) time. To implement priority queues, we use the heap. Heaps work better than simpler approaches because all operations take less than O(n) time. Heaps are best maintained in an array, where the children of the node at position i sit at positions 2i and 2i+1, so no explicit pointers are needed. For heaps, the basic operations run in constant or O(log n) time. For instance, Heapify-up and Heapify-down have O(log n) complexity, and ChangeKey and Delete are also O(log n). I would rate the ease of reading for this section at 6/10. I feel like its descriptions were overly complicated, and understanding these concepts was far easier with the material from class.
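The two maintenance routines can be sketched like this (a minimal min-heap using 0-indexed arrays, so children of node i sit at 2i+1 and 2i+2; the indexing convention is my own choice, the book uses 1-indexing):

```python
def heapify_up(h, i):
    # Bubble h[i] up while it is smaller than its parent: O(log n),
    # since each swap moves one level closer to the root.
    while i > 0:
        parent = (i - 1) // 2
        if h[i] < h[parent]:
            h[i], h[parent] = h[parent], h[i]
            i = parent
        else:
            break

def heapify_down(h, i):
    # Push h[i] down toward its smaller child: O(log n).
    n = len(h)
    while True:
        left, right = 2 * i + 1, 2 * i + 2
        smallest = i
        if left < n and h[left] < h[smallest]:
            smallest = left
        if right < n and h[right] < h[smallest]:
            smallest = right
        if smallest == i:
            break
        h[i], h[smallest] = h[smallest], h[i]
        i = smallest

# Insert = append then heapify_up; ExtractMin = swap root with the last
# element, pop it, then heapify_down from the root.
h = []
for key in [5, 3, 8, 1]:
    h.append(key)
    heapify_up(h, len(h) - 1)
print(h[0])  # prints 1 (the minimum key is always at the root)
```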
