====== 2.4: Runtimes ======

This chapter went into further detail about some common running times, including:

__**Linear Time O(n)**__

This running time is at most a constant factor times the size of the input. Common problems that have this running time include computing the maximum of a list and merging two sorted lists.

__**O(n log n) Time**__

This is the running time of any algorithm that splits its input into two equal-sized pieces, solves each piece recursively, and then combines the two solutions in linear time. Mergesort is the standard example.

__**Quadratic Time O(n^2)**__

Example for this: you're given a certain number of points in the plane, and you want to find which pair is closest together. The brute-force approach measures the distance between every pair of points. Since the distance from A to B is the same as from B to A, each pair only needs to be measured once, but that still leaves about n^2/2 measurements, so the runtime is O(n^2).

__**Cubic Time O(n^3)**__

From what I can tell, this typically comes from triple-nested loops, each of which runs O(n) times. They compound to make the runtime cubic.

__**O(n^k) Time**__

We obtain a runtime of O(n^k) for any constant k when we search over all subsets of size k, since there are O(n^k) such subsets.

__**Bigger than Polynomial**__

Algorithms of order O(n!), such as trying every way to match up n items with n other items (there are n! of them), are even more costly than polynomial runtimes like O(n^2).

__**Smaller than Linear**__

The binary search algorithm is O(log n). This runtime arises when an algorithm does a constant amount of work in order to throw away a constant fraction of the input.

This chapter was very straightforward. The only algorithms it detailed were there to demonstrate the runtimes of simple algorithms, such as those mentioned above. I don't have any questions about it, and I think learning this will just be very straightforward and memorization-focused.

====== 2.5: Priority Queues ======

This section focused on priority queues. It started with a general introduction to the data structure, which holds elements that each have a priority value (key).
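As a quick sketch of the keyed-element idea, here's what it looks like with Python's standard heapq module (a heap-based priority queue; the task labels are made up for illustration):

```python
import heapq

pq = []  # the priority queue: (key, element) pairs, smallest key first
heapq.heappush(pq, (3, "low-priority task"))
heapq.heappush(pq, (1, "urgent task"))
heapq.heappush(pq, (2, "medium task"))

# Elements come back out in increasing key order, regardless of
# the order they were pushed in.
first = heapq.heappop(pq)   # (1, 'urgent task')
second = heapq.heappop(pq)  # (2, 'medium task')
```

Both heappush and heappop run in O(log n) time, matching the bounds discussed in this section.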
These keys allow us to select the elements in the queue in order of priority. A priority queue supports addition, deletion, and selection of the element with the smallest key, each in O(log n) time.

The chapter then goes on to introduce the heap, which is a data structure for implementing a priority queue. It is a kind of balanced binary tree, with a root and nodes that can each have up to two children. The keys are in "heap order" if the key of any element is at least as large as the key of the element at its parent node.

The chapter then goes into heap operations, including Heapify-up, which allows us to insert a new element into a heap of n elements in O(log n) time. It looks like...

  Heapify-up(H, i):
    If i > 1 then
      let j = parent(i) = ⌊i/2⌋
      If key[H[i]] < key[H[j]] then
        swap the array entries H[i] and H[j]
        Heapify-up(H, j)
      Endif
    Endif
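To make sure I understand it, here's a minimal Python sketch of Heapify-up (my own illustration, not code from the chapter), assuming a min-heap stored in a 1-indexed array so that H[0] is an unused placeholder and parent(i) = i // 2:

```python
def heapify_up(H, i):
    """Restore heap order after placing a new key at 1-based index i:
    swap the key with its parent until it is no smaller than its
    parent. At most one swap per tree level, hence O(log n)."""
    while i > 1:
        j = i // 2  # parent(i) = floor(i/2)
        if H[i] < H[j]:
            H[i], H[j] = H[j], H[i]
            i = j
        else:
            break

def insert(H, key):
    """Insert a key into heap H (H[0] is a dummy slot)."""
    H.append(key)
    heapify_up(H, len(H) - 1)
```

For example, starting from H = [None] and inserting 5, 3, 8, 1 leaves the minimum key, 1, at H[1], with heap order holding everywhere below it.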