===== Section 4: Testing Bipartiteness: An Application of Breadth-First Search =====
* **Summary:**
A bipartite graph is one where the nodes can be partitioned into two sets X and Y such that every edge has one end in X and one end in Y. Imagine X = red and Y = blue. What are natural examples of nonbipartite graphs, where no such partition is possible? A triangle is the simplest one: a bipartite graph cannot contain an odd-length cycle. How do you recognize whether a graph is bipartite? We use BFS to color the graph. We start with one node (any node) and color it red (could be blue, it doesn't matter). Then we color all of the nodes connected to it blue (i.e. the first layer in BFS), then we color all of the nodes connected to those nodes red (layer 2), etc. until we finish the BFS. This doesn't take any more time than BFS, so it's O(m+n). Then the book goes through a proof of how this determines whether or not the graph is bipartite: if there's an edge between nodes of the same layer, it is not bipartite, but if there are no edges between nodes in the same layer, then it is bipartite. (A small code sketch of this coloring test follows this entry.)
* **Insights and Questions:**
We said that bipartite graphs apply to stable matching situations, but what else can they describe?
* **Readability:**
Very readable. And super short :) yay! Kind of confusing that it's an "application of BFS" but it also seems like something "big" itself. I think the title almost sells it short.
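
Just to make the coloring idea concrete, here's a minimal Python sketch of the layer-by-layer test (my own code, not the book's; the adjacency-dict representation and the function name are things I made up):

<code python>
from collections import deque

def is_bipartite(adj):
    """BFS 2-coloring. adj maps each node to a list of its neighbors
    (undirected graph, possibly disconnected)."""
    color = {}                          # node -> 0 ("red") or 1 ("blue")
    for start in adj:                   # handle every connected component
        if start in color:
            continue
        color[start] = 0                # color the starting node red
        queue = deque([start])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in color:              # next BFS layer gets the other color
                    color[v] = 1 - color[u]
                    queue.append(v)
                elif color[v] == color[u]:      # edge within a layer -> odd cycle
                    return False
    return True

# A 4-cycle is bipartite; a triangle (odd cycle) is not.
square   = {1: [2, 4], 2: [1, 3], 3: [2, 4], 4: [1, 3]}
triangle = {1: [2, 3], 2: [1, 3], 3: [1, 2]}
print(is_bipartite(square))    # True
print(is_bipartite(triangle))  # False
</code>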
===== Section 5: Connectivity in Directed Graphs =====
* **Summary:**
This section goes into the similarities/differences between directed and undirected graphs (and focuses on directed). Directed graphs are represented much like undirected graphs with an adjacency list, but instead of every node having a single list of its neighbors, it has a list of the nodes it has out-edges to and a list of the nodes it has in-edges to. BFS and DFS work almost exactly the same as in undirected graphs, except we only follow a node's out-edges. The algorithm still runs in O(m+n) time. But the nodes discovered by BFS (or DFS) don't necessarily have paths back to the starting node in this case. You can also do a reverse BFS (or DFS) on the graph by following each node's in-edges instead. Then the book goes into strong connectivity: two nodes u and v are "mutually reachable" if there's a path from u to v and from v to u, and a graph is strongly connected if every pair of nodes is mutually reachable. If u and v are mutually reachable and v and w are mutually reachable, then u and w are mutually reachable (the book has a proof, but it's pretty obvious). This lets us say that if we do a BFS along the out-edges from some node s and a BFS along the in-edges from s, the nodes found in both searches form the strong component containing s. Also, just as the connected components of two nodes s and t in an undirected graph are either identical or disjoint, the strong components of two nodes s and t in a directed graph are either identical or disjoint (the book gives a proof, p. 99). (A sketch of the two-search test follows this entry.)
* **Insights and Questions:**
None, really (sorry).
* **Readability:**
Makes a lot of sense. It was nice that this was JUST an explanation of directed graphs. Makes me wonder why they discussed them so much earlier, but I guess it was a good setup.
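
Here's a rough Python sketch of the two-search idea from this section (mine, not the book's; I'm assuming the graph is handed to us as separate out-edge and in-edge adjacency dicts, and the function names are made up):

<code python>
from collections import deque

def bfs_reachable(adj, s):
    """Ordinary BFS from s, following whatever adjacency lists it is given."""
    seen = {s}
    queue = deque([s])
    while queue:
        u = queue.popleft()
        for v in adj.get(u, []):
            if v not in seen:
                seen.add(v)
                queue.append(v)
    return seen

def strong_component(out_adj, in_adj, s):
    """Nodes mutually reachable with s: one BFS along out-edges plus one BFS
    along in-edges (i.e. on the reversed graph), then intersect the results."""
    return bfs_reachable(out_adj, s) & bfs_reachable(in_adj, s)

# 1 -> 2 -> 3 -> 1 is a directed cycle; 4 is reachable from it but can't get back.
out_adj = {1: [2], 2: [3], 3: [1, 4], 4: []}
in_adj  = {1: [3], 2: [1], 3: [2], 4: [3]}   # the reversed edges
print(strong_component(out_adj, in_adj, 1))  # {1, 2, 3}
</code>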
===== Section 6: Directed Acyclic Graphs and Topological Ordering =====
* **Summary:**
Undirected graphs without cycles are really simple (their connected components are trees), but it's possible for directed graphs to be "rich" (i.e. complicated) without having any (directed) cycles. A directed graph without cycles is a directed acyclic graph (DAG). DAGs can be used to encode precedence relations or dependencies. Ex. tasks that depend on each other, where nodes are the tasks and a directed edge (i, j) from node i to node j means that i must be done before j. If there's a cycle, ordering those tasks doesn't make any sense (every task in the cycle would have to come before every other task in the cycle, which is impossible). A valid ordering of the "tasks"/nodes in a DAG is called a topological ordering: the tasks are "lined up" and all of the edges "point forward." Then the book proves (p. 101) that if G has a topological ordering, it's a DAG. Does the reverse work? And how do you find a DAG's topological ordering? Yes, the reverse does work. The first node in the ordering must have no incoming edges, and every DAG has at least one node without incoming edges (the book proves this). Put that node first, delete it from the graph, and recursively compute a topological ordering of the leftover graph. The book goes into a more rigorous explanation. (A sketch of this approach follows this entry.)
* **Insights and Questions:**
The difference that they pointed out between directed and undirected graphs (that directed graphs with the same total number of edges as a "corresponding" undirected graph can be much more complicated) is really interesting. I guess it makes sense, but I hadn't really thought about comparing them in that way before.
* **Readability:**
As always, this section was very readable. I am glad I read it after we went over it in class, though, mostly because the word "topological" is kind of scary (haha ...).
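
Finally, a quick Python sketch of the "peel off a node with no incoming edges" idea (again my own code, not the book's; rather than literally deleting nodes it keeps in-degree counters, which is just one way to do the bookkeeping):

<code python>
from collections import deque

def topological_order(adj):
    """adj maps each node to the list of nodes it points to; every node
    must appear as a key. Repeatedly output a node with no incoming edges
    and "delete" it by decrementing its successors' in-degree counts."""
    indegree = {u: 0 for u in adj}
    for u in adj:
        for v in adj[u]:
            indegree[v] += 1
    ready = deque(u for u in adj if indegree[u] == 0)   # no incoming edges yet
    order = []
    while ready:
        u = ready.popleft()
        order.append(u)
        for v in adj[u]:
            indegree[v] -= 1
            if indegree[v] == 0:
                ready.append(v)
    if len(order) != len(adj):   # leftover nodes mean there was a directed cycle
        raise ValueError("not a DAG: no topological ordering exists")
    return order

# 1 must come before 2 and 3, and both must come before 4.
print(topological_order({1: [2, 3], 2: [4], 3: [4], 4: []}))  # [1, 2, 3, 4]
</code>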