====== 2.4: Runtimes ======
This chapter went into further detail about some common running times, including:

__**Linear Time O(n)**__

This running time is at most a constant factor times the size of the input. Common problems with this running time include computing the maximum of a list and merging two sorted lists.
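For instance, merging two sorted lists can be sketched in Python like this (a minimal sketch; the function name is my own):

```python
def merge(a, b):
    """Merge two sorted lists into one sorted list in O(n) time."""
    result = []
    i = j = 0
    # Repeatedly take the smaller front element from the two lists.
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:
            result.append(a[i])
            i += 1
        else:
            result.append(b[j])
            j += 1
    # One list is exhausted; append whatever remains of the other.
    result.extend(a[i:])
    result.extend(b[j:])
    return result
```

Each element is appended exactly once, which is why the total work is linear in the combined input size.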
  
  
__**O(nlogn) Time**__

This is the running time of any algorithm that splits its input into two equal-sized pieces, solves each piece recursively, and then combines the two solutions in linear time.

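Mergesort is the classic instance of this pattern; a minimal Python sketch:

```python
def merge_sort(items):
    """Split in half, sort each half recursively, merge in linear time: O(n log n)."""
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    # Combine the two sorted halves in linear time.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]
```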
__**Quadratic Time**__

An example of this: you're given a certain number of points in the plane, and you want to find which pair is closest together. The brute-force approach measures the distance between every pair of points; since distance is symmetric, each time you measure the distance between two points you can skip measuring it in the opposite direction, but there are still O(n^2) pairs to check.

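A brute-force sketch in Python, using itertools.combinations so each unordered pair is measured only once (function and variable names are my own):

```python
from itertools import combinations
from math import dist  # Euclidean distance between two points


def closest_pair(points):
    """Brute-force closest pair: check each unordered pair once, O(n^2) overall."""
    best = None
    best_d = float("inf")
    # combinations() yields each pair exactly once, so the
    # reverse direction of every measurement is skipped.
    for p, q in combinations(points, 2):
        d = dist(p, q)
        if d < best_d:
            best_d, best = d, (p, q)
    return best, best_d
```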
__**Cubic Time**__

From what I can tell, this usually happens with triple-nested loops, each of which runs O(n) times. The three loops compound to make the runtime cubic.

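As a hypothetical illustration (my own example, not one from the chapter), counting index triples whose values sum to zero uses three nested O(n) loops:

```python
def count_zero_triples(nums):
    """Count index triples i < j < k with nums[i] + nums[j] + nums[k] == 0.

    Three nested loops, each ranging over O(n) indices, give O(n^3) total work.
    """
    n = len(nums)
    count = 0
    for i in range(n):
        for j in range(i + 1, n):
            for k in range(j + 1, n):
                if nums[i] + nums[j] + nums[k] == 0:
                    count += 1
    return count
```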
__**O(n^k) Time**__

We obtain a runtime of O(n^k) for any constant k when we search over all subsets of size k.

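A sketch of this pattern in Python, using itertools.combinations to enumerate all C(n, k) = O(n^k) subsets of size k (the scoring function is a made-up example):

```python
from itertools import combinations


def best_k_subset(items, k, score):
    """Search over every size-k subset and return the one with the highest score.

    For constant k there are O(n^k) subsets, and enumerating
    them dominates the running time.
    """
    return max(combinations(items, k), key=score)
```

For example, `best_k_subset(nums, 2, sum)` finds the pair with the largest sum by trying all O(n^2) pairs.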
__**Bigger than Polynomial**__

Things that are of order O(n!), such as counting the number of ways to match up n items with n other items, are even more costly than polynomial functions of an order like O(n^2).

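As a small illustration (my own, not from the book), enumerating every matching of n items with n other items means walking through all n! permutations:

```python
from itertools import permutations


def all_matchings(left, right):
    """List every way to match up the items of `left` with the items of `right`.

    Each permutation of `right` pairs its elements with `left` in order,
    so there are n! matchings in total: growth far beyond any polynomial.
    """
    return [list(zip(left, perm)) for perm in permutations(right)]
```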
__**Smaller than Linear**__

The binary search algorithm is O(logn). This runtime arises when an algorithm does a constant amount of work in order to throw away a constant fraction of the input.

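A minimal Python sketch of binary search, where each iteration does constant work and discards half of the remaining input:

```python
def binary_search(sorted_items, target):
    """Return the index of target in a sorted list, or -1 if absent: O(log n)."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid          # found the target
        if sorted_items[mid] < target:
            lo = mid + 1        # throw away the lower half
        else:
            hi = mid - 1        # throw away the upper half
    return -1                   # target is not present
```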
This chapter was very straightforward. The only algorithms it detailed were there to demonstrate the runtimes of simple problems such as those mentioned above. I don't have any questions about it, and I think learning this will just be very straightforward and memorization-focused.
====== 2.5: Priority Queues ======
This section focused on priority queues. It started with a general introduction of the data structure, which holds elements that each have a priority value (key). This allows us to select the elements in the queue in order of priority. It supports addition, deletion, and selection of the element with the smallest key, each in O(logn) time. The chapter then goes on to introduce the heap, which is a data structure for implementing a priority queue. It is a sort of balanced binary tree with a root and nodes that can each have up to two children. The keys are in "heap order" if the key of any element is at least as large as the key of the element at its parent node. The chapter then goes into heap operations, including Heapify-up, which allows us to insert a new element into a heap of n elements in O(logn) time. It looks like...

    Heapify-up(H,i):
        If i>1 then
            let j=parent(i)=[i/2]
            If key[H[i]]<key[H[j]] then
                swap H[i] and H[j]
                Heapify-up(H,j)
            Endif
        Endif

It also includes Heapify-down, with which we can delete an element from a heap of n elements in O(logn) time. The algorithm for this function can be found on page 63 in the textbook, and seems to be relatively intuitive once you get the hang of Heapify-up. Near the end of this section is a list of other operations that we can build using the heap data structure and its Heapify-down and Heapify-up algorithms.

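The Heapify-up pseudocode above might translate into runnable Python roughly like this (a sketch using 0-based indexing, so parent(i) becomes (i-1)//2; the helper names are my own):

```python
def heapify_up(heap, i):
    """Restore heap order by swapping the element at index i upward.

    Each swap moves up one level of the tree, so this is O(log n),
    matching the pseudocode's recursive Heapify-up.
    """
    while i > 0:
        parent = (i - 1) // 2          # parent(i) with 0-based indexing
        if heap[i] < heap[parent]:
            heap[i], heap[parent] = heap[parent], heap[i]
            i = parent
        else:
            break                      # heap order already holds


def heap_insert(heap, item):
    """Insert by appending at the end, then Heapify-up to the proper place."""
    heap.append(item)
    heapify_up(heap, len(heap) - 1)
```

After a sequence of inserts, the smallest key always sits at index 0, which is what lets the priority queue find its minimum instantly.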
====== 3.1: Graphs ======
This section obviously focuses on graphs. The first couple of pages simply list out common examples of graphs in our everyday life. They seem to be everywhere! Very promising and exciting! Everything discussed about graphs in this section was extremely basic, and relates directly back to the discrete math class I had last semester. All of the terms such as node, path, directionality, and connectivity are defined the same way, and there is nothing about them that isn't intuitive for me. For that reason, I don't see any definitions or algorithms in this chapter as particularly noteworthy. It is just a fun and fluffy intro to graphs.