# Foundations of CS II

## Related Material

DPV Algorithms Book (for the last 9-10 lectures)

## Final Exam: What to Know

Posted by James Lee on December 7, 2010

The final exam will take place Monday, December 13th, in class, from 2:30-4:20pm. Just like lecture, it’s in room EEB 037.

As I said in class, the final exam will be open notes. Please limit yourself to one notebook worth of notes, along with things like the homework solutions and/or class readings, if you wish.

Here is a list of what you should know/be able to do for the final exam.

• Counting: permutations and combinations
• The axioms of discrete probability and how they can be used to derive various elementary properties (e.g. inclusion-exclusion)
• Conditional probabilities, the chain rule
• Bayes' Theorem, basic Bayesian inference (e.g. the robot problem)
• The definition of independence, and the properties of independent events
• The definition of a random variable and the probability mass function
• Expectation of a random variable
• Variance of a random variable
• Various types of discrete random variables: uniform, binomial, Poisson

• What tail bounds are useful for; Markov's inequality and Chebyshev's inequality
• The basics of continuous random variables; the probability density function and the cumulative distribution function
• The exponential and normal distributions
• The statement of the Central Limit Theorem; how it can be used
• Confidence intervals, parameter estimation
• The method of moments estimator
• Maximum-Likelihood estimation
• The definition of polynomial-time
• The basics of divide-and-conquer algorithms (be familiar with the lecture material, plus the homework problems)
• The basics of dynamic programming (same thing)

## Lecture 27: Polynomial-time reductions

Posted by James Lee on December 7, 2010

## Lecture 25: Dynamic Programming

Posted by James Lee on December 6, 2010

## Homework #8: Algorithms

Posted by James Lee on December 1, 2010

Due, in class: December 8th.

In solving the problem sets, you are allowed to collaborate with fellow students taking the class, but remember that you are required to write up the solutions by yourself or with a single partner. If you do collaborate in any way, you must acknowledge, for each problem, the people you worked with on that problem.

The problems have been carefully chosen for their pedagogical value, and hence might be similar to those given in past offerings of this course at UW, or similar to other courses at other schools. Using any pre-existing solutions from these sources, or from the web, constitutes a violation of the academic integrity you are expected to exemplify, and is strictly prohibited.

Most of the problems require only one or two key ideas for their solution. It will help you a lot to spell out these main ideas clearly, so that you can get most of the credit for a problem even if you err on the finer details.

A final piece of advice: Start working on the problem sets early! Don’t wait until the day (or few days) before they’re due.

## Problem 1

[Hint: Try to setup a stable matching instance that will solve the problem.]

## Problem 2: More inversions

Recall the problem of finding the number of inversions. As in the lecture, we are given a sequence of n numbers $a_1, a_2, \ldots, a_n$, which we assume are all distinct, and we define an inversion to be a pair $i < j$ such that $a_i > a_j$.

We motivated the problem of counting inversions as a good measure of how different two orderings are. However, one might feel that this measure is too sensitive. Let's call a pair a significant inversion if $i < j$ and $a_i > 2 a_j$. Give an $O(n \log n)$-time algorithm to count the number of significant inversions between two orderings.
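As a refresher, here is a sketch of the merge-sort-based counter for ordinary inversions from lecture (the significant-inversion variant the problem asks for requires a modified counting pass, which is left to you). The function name and structure here are illustrative, not from the lecture notes:

```python
def count_inversions(a):
    """Return (sorted copy of a, number of pairs i < j with a[i] > a[j]).

    Merge-sort based, so the running time is O(n log n).
    """
    if len(a) <= 1:
        return list(a), 0
    mid = len(a) // 2
    left, inv_left = count_inversions(a[:mid])
    right, inv_right = count_inversions(a[mid:])
    merged, i, j = [], 0, 0
    inversions = inv_left + inv_right
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            # left[i], ..., left[-1] are all greater than right[j],
            # and they all come before it: each is an inversion.
            inversions += len(left) - i
            merged.append(right[j])
            j += 1
    merged += left[i:] + right[j:]
    return merged, inversions
```

For example, `count_inversions([2, 4, 1, 3, 5])` reports 3 inversions: the pairs (2,1), (4,1), and (4,3).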

## Lecture 24: Divide & Conquer

Posted by James Lee on November 30, 2010

## Homework #7: Algorithms intro

Posted by James Lee on November 25, 2010

Due, in class: December 1st.

In solving the problem sets, you are allowed to collaborate with fellow students taking the class, but remember that you are required to write up the solutions by yourself or with a single partner. If you do collaborate in any way, you must acknowledge, for each problem, the people you worked with on that problem.

The problems have been carefully chosen for their pedagogical value, and hence might be similar to those given in past offerings of this course at UW, or similar to other courses at other schools. Using any pre-existing solutions from these sources, or from the web, constitutes a violation of the academic integrity you are expected to exemplify, and is strictly prohibited.

Most of the problems require only one or two key ideas for their solution. It will help you a lot to spell out these main ideas clearly, so that you can get most of the credit for a problem even if you err on the finer details.

A final piece of advice: Start working on the problem sets early! Don’t wait until the day (or few days) before they’re due.

## Problems

These three problems are taken from the Dasgupta-Papadimitriou-Vazirani (DPV) Algorithms book.

1. DPV book, Problem 2.4
2. DPV book, Problem 2.12
3. DPV book, Problem 2.16

## Lecture 23: A theory of efficiency

Posted by James Lee on November 24, 2010

## Lecture 22: Intro to Algorithms

Posted by James Lee on November 21, 2010

## Reading for the rest of the quarter

Posted by James Lee on November 20, 2010

In the rest of the quarter, we’ll see some basic algorithmic ideas, and also the basic theory of intractability and NP-completeness.

11/22 and 11/24 – Divide & conquer algorithms

Reading: Sections 2.1-2.5 of DPV.

11/29 and 12/1 – Dynamic Programming

Reading: Sections 6.1-6.4 of DPV.

12/3, 12/6, 12/8 – NP-completeness and reductions

Reading: Chapter 8 of DPV.

Looking to the future…

If you like algorithms, then you’ll want to take CSE 421 in the Winter quarter (taught by Anup Rao). If you like complexity theory (the study of problems that are hard for computers to solve), then you’ll want to take CSE 431 in the Spring quarter (taught by me). There is also the Machine Learning course CSE 446 taught next quarter by Oren Etzioni.

## Lecture 21: Randomized minimum cuts

Posted by James Lee on November 20, 2010

Today we talked about the following problem: Given an undirected input graph $G=(V,E)$, find a smallest subset of edges $E' \subseteq E$ such that $G'=(V, E \setminus E')$ is disconnected. We called this the Minimum Cut Problem.

We discussed a randomized algorithm.

Algorithm RMinCut:

While more than 2 nodes remain:

1. Choose an edge $e$ uniformly at random.

2. Contract $e$.

When exactly 2 nodes are left, output all the edges between them.

Recall that we maintain a multigraph during the execution, i.e. contractions will eventually produce multiple edges between pairs of nodes.
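The contraction step above can be sketched in code. This is only one possible implementation (the union-find bookkeeping is my choice, not something from lecture): instead of rebuilding the multigraph after each contraction, we track which original nodes have been merged, and treat an edge whose endpoints lie in the same merged super-node as a self-loop to be discarded.

```python
import random

def rmincut(edge_list, n):
    """One run of RMinCut on a multigraph with nodes 0..n-1.

    edge_list: list of (u, v) pairs; parallel edges are allowed.
    Returns the list of edges crossing the cut this run finds.
    """
    edges = list(edge_list)       # work on a copy
    parent = list(range(n))       # union-find: maps nodes to super-nodes

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    remaining = n
    while remaining > 2:
        u, v = random.choice(edges)
        ru, rv = find(u), find(v)
        if ru == rv:
            # Both endpoints already merged: a self-loop. Discard and retry,
            # which keeps the choice uniform over the surviving edges.
            edges.remove((u, v))
            continue
        parent[ru] = rv           # contract the chosen edge
        remaining -= 1
    # Edges whose endpoints lie in different super-nodes form the output cut.
    return [(u, v) for (u, v) in edges if find(u) != find(v)]
```

Since a single run can miss the minimum cut, one would repeat `rmincut` many times and keep the smallest cut seen, as the analysis below quantifies.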

Analysis:

Let $C^* = \{c_1, c_2, \ldots, c_k\}$ be some fixed minimum cut in $G$. Our algorithm will output $C^*$ as long as no edge of $C^*$ is ever contracted. Let’s calculate the probability of this happening.

First, during the execution of the algorithm, every node must have degree at least $k$ (the size of the minimum cut); otherwise, the edges incident to that node would themselves form a cut of size less than $k$, contradicting the minimality of $C^*$.

Therefore the probability that the $i$-th edge contracted ($i=0,1,2,\ldots$) lies in $C^*$ satisfies

$\displaystyle \Pr[e_i \in C^*] \leq \frac{k}{|E_i|} \leq \frac{2}{n-i},$

where $E_i$ denotes the edge set at the $i$-th stage; indeed, $|E_i| \geq k(n-i)/2$ because each of the $n-i$ remaining nodes has degree at least $k$.

So the probability that no edge of $C^*$ is contracted is at least

$\displaystyle \left(1-\frac{2}{n}\right) \left(1-\frac{2}{n-1}\right) \cdots \left(1-\frac{2}{4}\right) \left(1-\frac{2}{3}\right) = \left(\frac{n-2}{n}\right) \left(\frac{n-3}{n-1}\right) \left(\frac{n-4}{n-2}\right) \cdots \left(\frac{3}{5}\right)\left(\frac{2}{4}\right) \left(\frac{1}{3}\right).$

This is a telescoping product which equals $\frac{2}{n(n-1)}$.

Thus $\Pr[\textrm{find } C^*] \geq \frac{2}{n(n-1)}$. Now running the algorithm independently $O(n^2)$ times and outputting the smallest cut found ensures that with probability at least 90%, we find a minimum cut: with $t = \lceil \ln(10) \cdot n(n-1)/2 \rceil$ runs, the probability that every run misses is at most $\left(1-\frac{2}{n(n-1)}\right)^t \leq e^{-\ln 10} = \frac{1}{10}$.
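Both the telescoping product and the repetition bound are easy to sanity-check numerically; a quick sketch (the choice $n = 10$ is arbitrary):

```python
import math

n = 10  # any n >= 3 works here

# The telescoping product (1 - 2/n)(1 - 2/(n-1)) ... (1 - 2/3)
# should equal 2 / (n (n - 1)).
prod = 1.0
for i in range(n - 2):
    prod *= 1 - 2 / (n - i)
assert abs(prod - 2 / (n * (n - 1))) < 1e-12

# Repetitions for >= 90% overall success, using (1 - p)^t <= e^{-p t}:
p = 2 / (n * (n - 1))                # single-run success probability bound
t = math.ceil(math.log(10) / p)      # t = O(n^2) repetitions
assert (1 - p) ** t <= 0.1           # failure probability at most 10%
```

For $n = 10$ this gives $t = 104$ repetitions, consistent with the $O(n^2)$ claim.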