<?xml version="1.0" encoding="UTF-8"?>
<!-- generator="FeedCreator 1.8" -->
<?xml-stylesheet href="http://servo.ad.wlu.edu/dokuwiki/lib/exe/css.php?s=feed" type="text/css"?>
<rdf:RDF
    xmlns="http://purl.org/rss/1.0/"
    xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
    xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
    xmlns:dc="http://purl.org/dc/elements/1.1/">
    <channel rdf:about="http://servo.ad.wlu.edu/dokuwiki/feed.php">
        <title>W&amp;L Computer Science Wiki - courses:cs211:winter2012:journals:ian</title>
        <description></description>
        <link>http://servo.ad.wlu.edu/dokuwiki/</link>
        <image rdf:resource="http://servo.ad.wlu.edu/dokuwiki/lib/exe/fetch.php/wiki/dokuwiki-128.png" />
        <dc:date>2026-04-20T00:55:41+00:00</dc:date>
        <items>
            <rdf:Seq>
                <rdf:li rdf:resource="http://servo.ad.wlu.edu/dokuwiki/doku.php/courses/cs211/winter2012/journals/ian/chapter1?rev=1326859060&amp;do=diff"/>
                <rdf:li rdf:resource="http://servo.ad.wlu.edu/dokuwiki/doku.php/courses/cs211/winter2012/journals/ian/chapter2?rev=1327442075&amp;do=diff"/>
                <rdf:li rdf:resource="http://servo.ad.wlu.edu/dokuwiki/doku.php/courses/cs211/winter2012/journals/ian/chapter3?rev=1328065893&amp;do=diff"/>
                <rdf:li rdf:resource="http://servo.ad.wlu.edu/dokuwiki/doku.php/courses/cs211/winter2012/journals/ian/chapter4?rev=1331096887&amp;do=diff"/>
                <rdf:li rdf:resource="http://servo.ad.wlu.edu/dokuwiki/doku.php/courses/cs211/winter2012/journals/ian/chapter5?rev=1331698081&amp;do=diff"/>
                <rdf:li rdf:resource="http://servo.ad.wlu.edu/dokuwiki/doku.php/courses/cs211/winter2012/journals/ian/chapter6?rev=1332908829&amp;do=diff"/>
                <rdf:li rdf:resource="http://servo.ad.wlu.edu/dokuwiki/doku.php/courses/cs211/winter2012/journals/ian/chapter7?rev=1333512367&amp;do=diff"/>
                <rdf:li rdf:resource="http://servo.ad.wlu.edu/dokuwiki/doku.php/courses/cs211/winter2012/journals/ian/home?rev=1333510163&amp;do=diff"/>
                <rdf:li rdf:resource="http://servo.ad.wlu.edu/dokuwiki/doku.php/courses/cs211/winter2012/journals/ian/preface?rev=1326837239&amp;do=diff"/>
            </rdf:Seq>
        </items>
    </channel>
    <image rdf:about="http://servo.ad.wlu.edu/dokuwiki/lib/exe/fetch.php/wiki/dokuwiki-128.png">
        <title>W&amp;L Computer Science Wiki</title>
        <link>http://servo.ad.wlu.edu/dokuwiki/</link>
        <url>http://servo.ad.wlu.edu/dokuwiki/lib/exe/fetch.php/wiki/dokuwiki-128.png</url>
    </image>
    <item rdf:about="http://servo.ad.wlu.edu/dokuwiki/doku.php/courses/cs211/winter2012/journals/ian/chapter1?rev=1326859060&amp;do=diff">
        <dc:format>text/html</dc:format>
        <dc:date>2012-01-18T03:57:40+00:00</dc:date>
        <dc:creator>Anonymous (anonymous@undisclosed.example.com)</dc:creator>
        <title>chapter1</title>
        <link>http://servo.ad.wlu.edu/dokuwiki/doku.php/courses/cs211/winter2012/journals/ian/chapter1?rev=1326859060&amp;do=diff</link>
        <description>Chapter 1

1.1: Stable Matching

	*  There are many applications in which one wants to produce a self-enforcing matching: one in which no unmatched pair would mutually prefer to break their existing pairings and pair with each other. This problem is most readily approached as a matching of n parties to n parties, even though most real-world situations are n×m (reflecting a general intuition that it is often easiest to approach a hard problem by first solving related but simpler problems and …</description>
    </item>
    <item rdf:about="http://servo.ad.wlu.edu/dokuwiki/doku.php/courses/cs211/winter2012/journals/ian/chapter2?rev=1327442075&amp;do=diff">
        <dc:format>text/html</dc:format>
        <dc:date>2012-01-24T21:54:35+00:00</dc:date>
        <dc:creator>Anonymous (anonymous@undisclosed.example.com)</dc:creator>
        <title>chapter2</title>
        <link>http://servo.ad.wlu.edu/dokuwiki/doku.php/courses/cs211/winter2012/journals/ian/chapter2?rev=1327442075&amp;do=diff</link>
        <description>Chapter 2

2.1 Computational Tractability

	*  Comparing algorithms requires some measure of efficiency. This measure must be abstract for convenience of comparison, yet still applicable to typical real-world cases. The best-known general-purpose measure is worst-case running time; average-case analysis has its uses but is rarely well-defined. The natural baseline is brute-force search; since brute force typically takes exponential time, efficiency can be defined as the next-smallest…</description>
    </item>
    <item rdf:about="http://servo.ad.wlu.edu/dokuwiki/doku.php/courses/cs211/winter2012/journals/ian/chapter3?rev=1328065893&amp;do=diff">
        <dc:format>text/html</dc:format>
        <dc:date>2012-02-01T03:11:33+00:00</dc:date>
        <dc:creator>Anonymous (anonymous@undisclosed.example.com)</dc:creator>
        <title>chapter3</title>
        <link>http://servo.ad.wlu.edu/dokuwiki/doku.php/courses/cs211/winter2012/journals/ian/chapter3?rev=1328065893&amp;do=diff</link>
        <description>Chapter 3

3.1 Basic Definitions and Applications

	*  A graph is a collection of nodes and edges, which may be either directed or undirected; graphs can model a variety of structures and relationships, especially networks (whether physical or notional). Trees are connected graphs that contain no cycles. A typical operation on a graph is traversal between nodes along edges, and more generally the search for possible and optimal traversals.</description>
    </item>
    <item rdf:about="http://servo.ad.wlu.edu/dokuwiki/doku.php/courses/cs211/winter2012/journals/ian/chapter4?rev=1331096887&amp;do=diff">
        <dc:format>text/html</dc:format>
        <dc:date>2012-03-07T05:08:07+00:00</dc:date>
        <dc:creator>Anonymous (anonymous@undisclosed.example.com)</dc:creator>
        <title>chapter4</title>
        <link>http://servo.ad.wlu.edu/dokuwiki/doku.php/courses/cs211/winter2012/journals/ian/chapter4?rev=1331096887&amp;do=diff</link>
        <description>Chapter 4

4.1 Interval Scheduling

	*  A greedy algorithm is one that involves no backtracking or other global optimization: it builds the solution by a simple metric and does not rethink what it has already done. Interval scheduling is easily solved by repeatedly adding the earliest-finishing task that does not conflict with any task already chosen, until no such tasks remain. It can be proven optimal by…</description>
    </item>
    <item rdf:about="http://servo.ad.wlu.edu/dokuwiki/doku.php/courses/cs211/winter2012/journals/ian/chapter5?rev=1331698081&amp;do=diff">
        <dc:format>text/html</dc:format>
        <dc:date>2012-03-14T04:08:01+00:00</dc:date>
        <dc:creator>Anonymous (anonymous@undisclosed.example.com)</dc:creator>
        <title>chapter5</title>
        <link>http://servo.ad.wlu.edu/dokuwiki/doku.php/courses/cs211/winter2012/journals/ian/chapter5?rev=1331698081&amp;do=diff</link>
        <description>5.1 The Mergesort Algorithm

	*  Mergesort sorts a list by dividing it in half, recursively sorting the halves, and then merging the sorted halves back together. It exploits the fact that two sorted lists are easy to combine into one sorted list, and that dividing a list often enough eventually yields singleton lists, which are trivially sorted. For any algorithm of this form, in which the data is divided in half and the results combined in linear time, we have T(n) &lt;= 2T(n/2) + cn and T(2) &lt;= c. The sum of this fu…</description>
    </item>
    <item rdf:about="http://servo.ad.wlu.edu/dokuwiki/doku.php/courses/cs211/winter2012/journals/ian/chapter6?rev=1332908829&amp;do=diff">
        <dc:format>text/html</dc:format>
        <dc:date>2012-03-28T04:27:09+00:00</dc:date>
        <dc:creator>Anonymous (anonymous@undisclosed.example.com)</dc:creator>
        <title>chapter6</title>
        <link>http://servo.ad.wlu.edu/dokuwiki/doku.php/courses/cs211/winter2012/journals/ian/chapter6?rev=1332908829&amp;do=diff</link>
        <description>6.1 Weighted Interval Scheduling

	*  A greedy algorithm can efficiently solve unweighted interval scheduling, but it fails when the intervals are weighted: the greedy algorithm may choose a task that precludes a later, heavily weighted task. We can solve this without resorting to brute force by limited backtracking: again go through the list in order of finish time, but now, instead of committing to schedule each available task, recursively consider the remaining possibilities conditional on including and ex…</description>
    </item>
    <item rdf:about="http://servo.ad.wlu.edu/dokuwiki/doku.php/courses/cs211/winter2012/journals/ian/chapter7?rev=1333512367&amp;do=diff">
        <dc:format>text/html</dc:format>
        <dc:date>2012-04-04T04:06:07+00:00</dc:date>
        <dc:creator>Anonymous (anonymous@undisclosed.example.com)</dc:creator>
        <title>chapter7</title>
        <link>http://servo.ad.wlu.edu/dokuwiki/doku.php/courses/cs211/winter2012/journals/ian/chapter7?rev=1333512367&amp;do=diff</link>
        <description>7.1 Maximum Flow and Ford-Fulkerson

	*  The problem is to find a set of flows along directed edges with capacity constraints that maximizes the flow between two nodes. Assume integer capacities, no edges entering the source or leaving the sink, and a connected graph. The sum of flows out of a node must equal the sum of flows in; since all flow must pass through it, the capacity of any cut separating the source and sink is an upper bound on the flow. No obvious greedy algorithm works, as a misdirected flow can cut off future pat…</description>
    </item>
    <item rdf:about="http://servo.ad.wlu.edu/dokuwiki/doku.php/courses/cs211/winter2012/journals/ian/home?rev=1333510163&amp;do=diff">
        <dc:format>text/html</dc:format>
        <dc:date>2012-04-04T03:29:23+00:00</dc:date>
        <dc:creator>Anonymous (anonymous@undisclosed.example.com)</dc:creator>
        <title>home</title>
        <link>http://servo.ad.wlu.edu/dokuwiki/doku.php/courses/cs211/winter2012/journals/ian/home?rev=1333510163&amp;do=diff</link>
        <description>Ian&#039;s Journal

	*  Preface
	*  Chapter 1: Introduction: Some Representative Problems
	*  Chapter 2: Basics of Algorithm Analysis
	*  Chapter 3: Graphs
	*  Chapter 4: Greedy Algorithms
	*  Chapter 5: Divide and Conquer
	*  Chapter 6: Dynamic Programming
	*  Chapter 7: The Ford-Fulkerson Algorithm</description>
    </item>
    <item rdf:about="http://servo.ad.wlu.edu/dokuwiki/doku.php/courses/cs211/winter2012/journals/ian/preface?rev=1326837239&amp;do=diff">
        <dc:format>text/html</dc:format>
        <dc:date>2012-01-17T21:53:59+00:00</dc:date>
        <dc:creator>Anonymous (anonymous@undisclosed.example.com)</dc:creator>
        <title>preface</title>
        <link>http://servo.ad.wlu.edu/dokuwiki/doku.php/courses/cs211/winter2012/journals/ian/preface?rev=1326837239&amp;do=diff</link>
        <description>Preface

	*  Algorithms, and the insight afforded by their formal study, are widely useful. In particular, algorithms are the core of computer science once one abstracts away the messier details of interface and low-level implementation. Algorithms are therefore more amenable to formal analysis than computer science as a whole, while remaining sufficiently general that observations about algorithms illuminate the whole field.</description>
    </item>
</rdf:RDF>
