
Chapter 2

2.1: Computational Tractability

The goal of this course is to write correct and efficient algorithms, and this section goes through how we define efficiency.

Initial attempt at defining efficiency: "An algorithm is efficient if, when implemented, it runs quickly on real input instances."

What works: it ties efficiency to what we actually care about, performance in practice.

What doesn't work: "quickly" and "real input instances" are vague, it depends on the hardware and the implementation, and it says nothing about how the running time scales as input sizes grow.

Worst-Case Running Times and Brute-Force Search: "An algorithm is efficient if it achieves qualitatively better worst-case performance, at an analytical level, than brute-force search."

Why it works: worst-case analysis gives a guarantee over all possible inputs, and brute-force search (trying every possibility) is a natural baseline to beat.

Why it doesn't work: "qualitatively better" is still vague, there is no concrete definition of how much better an algorithm has to be.

Polynomial Time as a Definition of Efficiency: "An algorithm is efficient if it has a polynomial running time."

Why it works: it is concrete and negatable, and polynomial running times scale well: doubling the input size multiplies the running time by only a constant factor.

Why it doesn't work: a polynomial running time like n^100 counts as "efficient" even though it is useless in practice, while some exponential-time algorithms run quickly on the instances that actually arise.

2.2: Asymptotic Order of Growth

This section discusses how to measure the order of growth of an algorithm's running time and looks at how to find a lower bound, an upper bound, and a tight bound. Our goal is to express the growth rate of running times of algorithms.

Upper bounds: T(n) is O(f(n)) if there exist constants c > 0 and n0 >= 0 such that T(n) <= c*f(n) for all n >= n0.

Lower bound: T(n) is Ω(f(n)) if there exist constants ε > 0 and n0 >= 0 such that T(n) >= ε*f(n) for all n >= n0.

Tight bounds: T(n) is Θ(f(n)) if T(n) is both O(f(n)) and Ω(f(n)).
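
As a worked example (the textbook's standard one, restated here from memory): take T(n) = pn^2 + qn + r with p, q, r > 0. For all n >= 1 we have T(n) <= pn^2 + qn^2 + rn^2 = (p + q + r)n^2, so T(n) = O(n^2) with c = p + q + r. Also T(n) >= pn^2 for all n >= 1, so T(n) = Ω(n^2) with ε = p. Together these give T(n) = Θ(n^2).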

Properties of Asymptotic Growth Rates

1. Transitivity: 
  * If f = O(g) and g = O(h) then f = O(h)
  * If f = Ω(g) and g = Ω(h) then f = Ω(h)
  * If f = Θ(g) and g = Θ(h) then f = Θ(h)
2. Sum of Functions
  * Suppose that f and g are two functions such that, for some other function h, we have f = O(h) and g = O(h); then f + g = O(h)
  * The same can be said for any finite number of functions
  * If g = O(f), then f + g = O(f); for example, with f = n^2 and g = n, we get n^2 + n = O(n^2) (in fact Θ(n^2))

Asymptotic bounds for some common functions

Helpful to have the bounds for common functions, but I am wondering how we will find them in the future.
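
One quick way to get intuition (my own sanity check, not from the book) is just to tabulate the common functions and watch the ordering log n < n < n log n < n^2 < 2^n emerge:

```python
import math

# Tabulate common growth functions at a few input sizes; the asymptotic
# ordering log n < n < n log n < n^2 < 2^n shows up almost immediately.
for n in (10, 100, 1000):
    values = (math.log2(n), n, n * math.log2(n), n ** 2, 2.0 ** n)
    print(f"n={n}:", "  ".join(f"{v:.3g}" for v in values))
```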

2.3: Implementing the Stable Matching Algorithm Using Lists and Arrays

Moving from a general discussion of the GS algorithm to an actual implementation. First we need to figure out which data structures to use.

Arrays: O(1) access to the i-th element and O(1) to check or overwrite it, but the length is fixed and searching for or deleting an element takes O(n).

Lists: (linked lists) O(1) insertion and deletion given a pointer to the position, and they grow and shrink easily, but finding the i-th element takes O(i) since we have to follow pointers.

We will need to use both in the stable matching algorithm (and we can convert between the two representations in O(n) time when needed).

This gives an O(n^2) running time: the while loop runs at most n^2 times, and with the right data structures each iteration takes O(1).
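
Here is a minimal sketch of the whole thing in Python, assuming complete preference lists; the names (man_pref, rank, next_proposal, free_men) are mine, not the book's, but it follows the array-plus-list plan above:

```python
def stable_matching(man_pref, woman_pref):
    """man_pref[m] / woman_pref[w] are full preference lists of indices."""
    n = len(man_pref)
    # rank[w][m] = position of man m on woman w's list, built once in
    # O(n^2) so "does w prefer m to her current partner?" is O(1)
    rank = [[0] * n for _ in range(n)]
    for w in range(n):
        for pos, m in enumerate(woman_pref[w]):
            rank[w][m] = pos
    next_proposal = [0] * n     # index of the next woman on m's list
    husband = [None] * n        # husband[w] = w's current partner
    free_men = list(range(n))   # list used as a stack of free men
    while free_men:             # at most n^2 iterations, each O(1)
        m = free_men.pop()
        w = man_pref[m][next_proposal[m]]
        next_proposal[m] += 1
        if husband[w] is None:
            husband[w] = m                  # w was free: they get engaged
        elif rank[w][m] < rank[w][husband[w]]:
            free_men.append(husband[w])     # w prefers m: old partner freed
            husband[w] = m
        else:
            free_men.append(m)              # w rejects m: m stays free
    return {husband[w]: w for w in range(n)}   # man -> woman
```

Building rank costs O(n^2) up front, and every operation inside the while loop is O(1), which is exactly what makes the overall O(n^2) bound work.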

Getting into this stuff is more difficult for me. I don't remember running times as well, and when I try to think through the process logically it doesn't always work. Any step-by-step process would be helpful, but I am guessing it's something you get better at with time.

2.4: A Survey of Common Running Times

This section was easier to read since we just spent so much time on it and I felt pretty comfortable with it.

Linear time: the running time is proportional to the input size; e.g., computing the maximum of n numbers, or merging two sorted lists.
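
A sketch of the merging example (my own variable names); it runs in O(n) because each comparison moves one element to the output:

```python
# Merge two sorted lists in O(n): each loop iteration appends one element,
# and there are len(a) + len(b) elements in total.
def merge(a, b):
    merged, i, j = [], 0, 0
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:
            merged.append(a[i])
            i += 1
        else:
            merged.append(b[j])
            j += 1
    merged.extend(a[i:])   # at most one of these tails is non-empty
    merged.extend(b[j:])
    return merged
```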

O(n log n) time: the running time of any algorithm that splits its input in half, solves each half recursively, and combines the solutions in linear time; mergesort is the classic example.
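
A sketch of mergesort, leaning on the standard library's heapq.merge for the linear-time combine step:

```python
from heapq import merge   # stdlib linear-time merge of sorted iterables

def mergesort(xs):
    # O(n log n): log n levels of recursion, O(n) merging work per level
    if len(xs) <= 1:
        return xs
    mid = len(xs) // 2
    return list(merge(mergesort(xs[:mid]), mergesort(xs[mid:])))
```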

Quadratic time: usually a pair of nested loops over the input; e.g., the brute-force algorithm for the closest pair of points, which tries all O(n^2) pairs.
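
A sketch of that brute force (points as (x, y) tuples; names are mine):

```python
from itertools import combinations
from math import dist   # Euclidean distance, Python 3.8+

def closest_pair(points):
    # Try all n*(n-1)/2 pairs of points: O(n^2) distance computations
    best_pair, best_d = None, float("inf")
    for p, q in combinations(points, 2):
        d = dist(p, q)
        if d < best_d:
            best_pair, best_d = (p, q), d
    return best_pair, best_d
```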

Cubic time: three nested loops; e.g., given n sets (each of size at most n), deciding whether some pair of them is disjoint.
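
A sketch of the disjoint-pair check, assuming each set supports O(1) membership tests (Python sets, or bit vectors in the book's setting):

```python
def has_disjoint_pair(sets):
    # sets: a list of Python sets over {1..n}; membership is O(1), so
    # checking one pair costs O(n), and with O(n^2) pairs this is O(n^3).
    for i in range(len(sets)):
        for j in range(i + 1, len(sets)):
            if not any(x in sets[j] for x in sets[i]):
                return (i, j)   # found a disjoint pair
    return None
```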

O(n^k): search over all subsets of size k; e.g., deciding whether a graph contains an independent set of size k by trying all roughly n^k subsets.
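
A sketch of that search; adj is my own encoding, a dict mapping each node to the set of its neighbours:

```python
from itertools import combinations

def has_independent_set(adj, k):
    # Try all C(n, k) = O(n^k) node subsets of size k; for each, the
    # pairwise edge check costs O(k^2), a constant for fixed k.
    for subset in combinations(adj, k):
        if all(v not in adj[u] for u, v in combinations(subset, 2)):
            return subset   # no edge inside the subset: independent
    return None
```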

Beyond polynomial time: running times like 2^n (searching over all subsets, e.g., finding a maximum independent set) and n! (searching over all orderings or matchings, e.g., the traveling salesman problem).
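
A sketch of the 2^n brute force for maximum independent set, enumerating subsets as bitmasks (same adj encoding as above):

```python
def max_independent_set(adj):
    # Enumerate all 2^n subsets of nodes via bitmasks and keep the
    # largest one with no internal edge; brute force, exponential time.
    nodes = list(adj)
    best = []
    for mask in range(1 << len(nodes)):
        subset = [nodes[i] for i in range(len(nodes)) if mask >> i & 1]
        if len(subset) > len(best) and all(
            v not in adj[u]
            for i, u in enumerate(subset) for v in subset[i + 1:]
        ):
            best = subset
    return best
```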

Sublinear time: the algorithm doesn't even read its whole input; e.g., binary search in a sorted array probes only O(log n) positions.
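
A sketch of binary search; it halves the active range at every probe:

```python
def binary_search(a, target):
    # O(log n): the interval [lo, hi] halves on every iteration
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if a[mid] == target:
            return mid
        if a[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1   # not found
```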

This all makes sense to me. I think the biggest struggle will be when I need to design an algorithm with a certain running time, but these are the notes I should look at!

2.5: A More Complex Data Structure: Priority Queues

Priority Queue: a data structure that maintains a set of elements, each with a key, and supports adding elements and extracting the element with the smallest key.

Use a heap to implement a priority queue: a heap is a balanced binary tree in which each node's key is no larger than its children's keys (heap order), and it can be stored in an array where the children of node i sit at positions 2i and 2i+1.

Use these operations:

  * Insert: add the element at the end, then Heapify-up; O(log n)
  * FindMin: the minimum sits at the root; O(1)
  * ExtractMin / Delete: swap in the last element, then Heapify-down (or up); O(log n)
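
A minimal array-based min-heap sketch (0-indexed, so the children of i are at 2i+1 and 2i+2 rather than the book's 1-indexed 2i and 2i+1); class and method names are my own:

```python
class MinHeap:
    def __init__(self):
        self.a = []   # array representation of the heap

    def insert(self, key):
        # O(log n): append at the end, then Heapify-up toward the root
        self.a.append(key)
        i = len(self.a) - 1
        while i > 0 and self.a[i] < self.a[(i - 1) // 2]:
            parent = (i - 1) // 2
            self.a[i], self.a[parent] = self.a[parent], self.a[i]
            i = parent

    def find_min(self):
        # O(1): heap order puts the minimum at the root
        return self.a[0]

    def extract_min(self):
        # O(log n): remove the root, move the last element up, Heapify-down
        root, last = self.a[0], self.a.pop()
        if self.a:
            self.a[0] = last
            i = 0
            while True:
                smallest = i
                for c in (2 * i + 1, 2 * i + 2):
                    if c < len(self.a) and self.a[c] < self.a[smallest]:
                        smallest = c
                if smallest == i:
                    break
                self.a[i], self.a[smallest] = self.a[smallest], self.a[i]
                i = smallest
        return root
```

In practice Python's heapq module (heappush/heappop) implements the same operations on a plain list.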

I think heaps are soooo fun! I enjoyed this chapter because it laid out the code in a pretty readable way.