\chapter{Probability}

\index{probability}

A \key{probability} is a real number between $0$ and $1$
that indicates how probable an event is.
If an event is certain to happen,
its probability is 1,
and if an event is impossible,
its probability is 0.
The probability of an event is denoted $P(\cdots)$
where the three dots describe the event.

For example, when throwing a die,
the outcome is an integer between $1$ and $6$,
and the probability of each outcome is $1/6$.
Hence, we can calculate the following probabilities:
\begin{itemize}[noitemsep]
\item $P(\textrm{''the outcome is 4''})=1/6$
\item $P(\textrm{''the outcome is not 6''})=5/6$
\item $P(\textrm{''the outcome is even''})=1/2$
\end{itemize}
\section{Calculation}

To calculate the probability of an event,
we can either use combinatorics
or simulate the process that generates the event.
As an example, let us calculate the probability
of drawing three cards with the same value
from a shuffled deck of cards
(for example, $\spadesuit 8$, $\clubsuit 8$ and $\diamondsuit 8$).

\subsubsection*{Method 1}

We can calculate the probability using the formula

\[\frac{\textrm{number of desired outcomes}}{\textrm{total number of outcomes}}.\]
In this problem, the desired outcomes are those
in which the value of each card is the same.
There are $13 {4 \choose 3}$ such outcomes,
because there are $13$ possibilities for the
value of the cards and ${4 \choose 3}$ ways to
choose $3$ suits from $4$ possible suits.
There are a total of ${52 \choose 3}$ outcomes,
because we choose 3 cards from 52 cards.
Thus, the probability of the event is

\[\frac{13 {4 \choose 3}}{{52 \choose 3}} = \frac{1}{425}.\]
\subsubsection*{Method 2}

Another way to calculate the probability is
to simulate the process that generates the event.
In this example, we draw three cards, so the process
consists of three steps.
We require that each step of the process is successful.

Drawing the first card certainly succeeds,
because there are no restrictions.
The second step succeeds with probability $3/51$,
because there are 51 cards left and 3 of them
have the same value as the first card.
In a similar way, the third step succeeds with probability $2/50$.

The probability that the entire process succeeds is

\[1 \cdot \frac{3}{51} \cdot \frac{2}{50} = \frac{1}{425}.\]
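
We can also check such a result experimentally.
The following code is a sketch of our own (the seed and the
number of trials are arbitrary choices) that shuffles a deck
repeatedly and counts how often the first three cards have
the same value:

\begin{lstlisting}
#include <algorithm>
#include <iostream>
#include <random>
#include <vector>
using namespace std;

int main() {
    mt19937 rng(12345); // arbitrary seed
    vector<int> deck(52);
    for (int i = 0; i < 52; i++) deck[i] = i;
    long long count = 0, trials = 1000000;
    for (long long t = 0; t < trials; t++) {
        shuffle(deck.begin(), deck.end(), rng);
        // the value of card i is i%13
        if (deck[0]%13 == deck[1]%13 &&
            deck[1]%13 == deck[2]%13) count++;
    }
    // empirical frequency, close to 1/425 = 0.00235...
    cout << (double)count/trials << "\n";
}
\end{lstlisting}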

\section{Events}

An event in probability theory can be represented as a set
\[A \subset X,\]
where $X$ contains all possible outcomes
and $A$ is a subset of outcomes.
For example, when throwing a die, the outcomes are
\[X = \{1,2,3,4,5,6\}.\]
Now, for example, the event ''the outcome is even''
corresponds to the set
\[A = \{2,4,6\}.\]

Each outcome $x$ is assigned a probability $p(x)$.
Then, the probability $P(A)$ of an event
$A$ can be calculated as a sum
of probabilities of outcomes using the formula
\[P(A) = \sum_{x \in A} p(x).\]
For example, when throwing a die,
$p(x)=1/6$ for each outcome $x$,
so the probability of the event
''the outcome is even'' is
\[p(2)+p(4)+p(6)=1/2.\]

The total probability of the outcomes in $X$ must
be 1, i.e., $P(X)=1$.

Since the events in probability theory are sets,
we can manipulate them using standard set operations:
\begin{itemize}
\item The \key{complement} $\bar A$ means
''$A$ does not happen''.
For example, when throwing a die,
the complement of $A=\{2,4,6\}$ is
$\bar A = \{1,3,5\}$.
\item The \key{union} $A \cup B$ means
''$A$ or $B$ happens''.
For example, the union of
$A=\{2,5\}$
and $B=\{4,5,6\}$ is
$A \cup B = \{2,4,5,6\}$.
\item The \key{intersection} $A \cap B$ means
''$A$ and $B$ happen''.
For example, the intersection of
$A=\{2,5\}$ and $B=\{4,5,6\}$ is
$A \cap B = \{5\}$.
\end{itemize}

\subsubsection{Complement}

The probability of the complement
$\bar A$ is calculated using the formula
\[P(\bar A)=1-P(A).\]
Sometimes, we can solve a problem easily
using complements by solving the opposite problem.
For example, the probability of getting
at least one six when throwing a die ten times is
\[1-(5/6)^{10}.\]
Here $5/6$ is the probability that the outcome
of a single throw is not six, and
$(5/6)^{10}$ is the probability that none of
the ten throws is a six.
The complement of this is the answer to the problem.

\subsubsection{Union}

The probability of the union $A \cup B$
is calculated using the formula
\[P(A \cup B)=P(A)+P(B)-P(A \cap B).\]
For example, when throwing a die,
the union of the events
\[A=\textrm{''the outcome is even''}\]
and
\[B=\textrm{''the outcome is less than 4''}\]
is
\[A \cup B=\textrm{''the outcome is even or less than 4''},\]
and its probability is
\[P(A \cup B) = P(A)+P(B)-P(A \cap B)=1/2+1/2-1/6=5/6.\]
If the events $A$ and $B$ are \key{disjoint}, i.e.,
$A \cap B$ is empty,
the probability of the event $A \cup B$ is simply
\[P(A \cup B)=P(A)+P(B).\]

\subsubsection{Conditional probability}

\index{conditional probability}

The \key{conditional probability}
\[P(A | B) = \frac{P(A \cap B)}{P(B)}\]
is the probability of $A$
assuming that $B$ happens.
Hence, when calculating the
probability of $A$, we only consider the outcomes
that also belong to $B$.

Using the sets $A$ and $B$ of the previous example,
\[P(A | B)= 1/3,\]
because the outcomes of $B$ are
$\{1,2,3\}$, and one of them is even.
This is the probability of an even outcome
if we know that the outcome is between $1$ and $3$.

\subsubsection{Intersection}

\index{independence}

Using conditional probability,
the probability of the intersection
$A \cap B$ can be calculated using the formula
\[P(A \cap B)=P(A)P(B|A).\]
Events $A$ and $B$ are \key{independent} if
\[P(A|B)=P(A) \hspace{10px}\textrm{and}\hspace{10px} P(B|A)=P(B),\]
which means that the fact that $B$ happens does not
change the probability of $A$, and vice versa.
In this case, the probability of the intersection is
\[P(A \cap B)=P(A)P(B).\]
For example, when drawing a card from a deck, the events
\[A = \textrm{''the suit is clubs''}\]
and
\[B = \textrm{''the value is four''}\]
are independent. Hence the event
\[A \cap B = \textrm{''the card is the four of clubs''}\]
happens with probability
\[P(A \cap B)=P(A)P(B)=1/4 \cdot 1/13 = 1/52.\]

\section{Random variables}

\index{random variable}

A \key{random variable} is a value that is generated
by a random process.
For example, when throwing two dice,
a possible random variable is
\[X=\textrm{''the sum of the outcomes''}.\]
For example, if the outcomes are $[4,6]$
(meaning that we first throw a four and then a six),
then the value of $X$ is 10.

We denote by $P(X=x)$ the probability that
the value of a random variable $X$ is $x$.
For example, when throwing two dice,
$P(X=10)=3/36$,
because the total number of outcomes is 36
and there are three possible ways to obtain
the sum 10: $[4,6]$, $[5,5]$ and $[6,4]$.

\subsubsection{Expected value}

\index{expected value}

The \key{expected value} $E[X]$ indicates the
average value of a random variable $X$.
The expected value can be calculated as the sum
\[\sum_x P(X=x)x,\]
where $x$ goes through all possible values of $X$.

For example, when throwing a die,
the expected outcome is
\[1/6 \cdot 1 + 1/6 \cdot 2 + 1/6 \cdot 3 + 1/6 \cdot 4 + 1/6 \cdot 5 + 1/6 \cdot 6 = 7/2.\]

A useful property of expected values is \key{linearity}.
It means that the sum
$E[X_1+X_2+\cdots+X_n]$
always equals the sum
$E[X_1]+E[X_2]+\cdots+E[X_n]$.
This formula holds even if the random variables
depend on each other.

For example, when throwing two dice,
the expected sum is
\[E[X_1+X_2]=E[X_1]+E[X_2]=7/2+7/2=7.\]

Let us now consider a problem where
$n$ balls are randomly placed in $n$ boxes,
and our task is to calculate the expected
number of empty boxes.
Each ball has an equal probability to
be placed in any of the boxes.
For example, if $n=2$, the possibilities
are as follows:
\begin{center}
\begin{tikzpicture}
\draw (0,0) rectangle (1,1);
\draw (1.2,0) rectangle (2.2,1);
\draw (3,0) rectangle (4,1);
\draw (4.2,0) rectangle (5.2,1);
\draw (6,0) rectangle (7,1);
\draw (7.2,0) rectangle (8.2,1);
\draw (9,0) rectangle (10,1);
\draw (10.2,0) rectangle (11.2,1);
\draw[fill=blue] (0.5,0.2) circle (0.1);
\draw[fill=red] (1.7,0.2) circle (0.1);
\draw[fill=red] (3.5,0.2) circle (0.1);
\draw[fill=blue] (4.7,0.2) circle (0.1);
\draw[fill=blue] (6.25,0.2) circle (0.1);
\draw[fill=red] (6.75,0.2) circle (0.1);
\draw[fill=blue] (10.45,0.2) circle (0.1);
\draw[fill=red] (10.95,0.2) circle (0.1);
\end{tikzpicture}
\end{center}

In this case, the expected number of
empty boxes is
\[\frac{0+0+1+1}{4} = \frac{1}{2}.\]
In the general case, the probability that a
single box is empty is
\[\Big(\frac{n-1}{n}\Big)^n,\]
because no ball should be placed in it.
Hence, using linearity, the expected number of
empty boxes is
\[n \cdot \Big(\frac{n-1}{n}\Big)^n.\]
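
As a sanity check, the following sketch of our own
(the value of $n$, the seed and the trial count are
arbitrary) compares the empirical average number of
empty boxes with the formula:

\begin{lstlisting}
#include <cmath>
#include <iostream>
#include <random>
#include <vector>
using namespace std;

int main() {
    int n = 10; // arbitrary number of balls and boxes
    mt19937 rng(12345); // arbitrary seed
    long long trials = 100000, total = 0;
    for (long long t = 0; t < trials; t++) {
        vector<int> balls(n, 0);
        // place each ball in a random box
        for (int i = 0; i < n; i++) balls[rng()%n]++;
        for (int i = 0; i < n; i++) if (balls[i] == 0) total++;
    }
    cout << (double)total/trials << "\n"; // empirical average
    cout << n*pow((n-1.0)/n, n) << "\n";  // formula: 3.4867... for n=10
}
\end{lstlisting}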

\subsubsection{Distributions}

\index{distribution}

The \key{distribution} of a random variable $X$
shows the probability of each value that
$X$ may have.
The distribution consists of values $P(X=x)$.
For example, when throwing two dice,
the distribution for their sum is:
\begin{center}
\small {
\begin{tabular}{r|rrrrrrrrrrr}
$x$ & 2 & 3 & 4 & 5 & 6 & 7 & 8 & 9 & 10 & 11 & 12 \\
$P(X=x)$ & $1/36$ & $2/36$ & $3/36$ & $4/36$ & $5/36$ & $6/36$ & $5/36$ & $4/36$ & $3/36$ & $2/36$ & $1/36$ \\
\end{tabular}
}
\end{center}

\index{uniform distribution}
In a \key{uniform distribution},
the random variable $X$ has $n$ possible
values $a,a+1,\ldots,b$ (so $n=b-a+1$),
and the probability of each value is $1/n$.
For example, when throwing a die,
$a=1$, $b=6$ and $P(X=x)=1/6$ for each value $x$.

The expected value of $X$ in a uniform distribution is
\[E[X] = \frac{a+b}{2}.\]

\index{binomial distribution}
In a \key{binomial distribution}, $n$ attempts
are made
and the probability that a single attempt succeeds
is $p$.
The random variable $X$ counts the number of
successful attempts,
and the probability of a value $x$ is
\[P(X=x)=p^x (1-p)^{n-x} {n \choose x},\]
where $p^x$ and $(1-p)^{n-x}$ correspond to
successful and unsuccessful attempts,
and ${n \choose x}$ is the number of ways
we can choose the order of the attempts.

For example, when throwing a die ten times,
the probability of throwing a six exactly
three times is $(1/6)^3 (5/6)^7 {10 \choose 3}$.

The expected value of $X$ in a binomial distribution is
\[E[X] = pn.\]

\index{geometric distribution}
In a \key{geometric distribution},
the probability that an attempt succeeds is $p$,
and we continue until the first success happens.
The random variable $X$ counts the number
of attempts needed, and the probability of
a value $x$ is
\[P(X=x)=(1-p)^{x-1} p,\]
where $(1-p)^{x-1}$ corresponds to the unsuccessful attempts
and $p$ corresponds to the first successful attempt.

For example, if we throw a die until we throw a six,
the probability that the number of throws
is exactly 4 is $(5/6)^3 1/6$.

The expected value of $X$ in a geometric distribution is
\[E[X]=\frac{1}{p}.\]
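
These probabilities are also easy to evaluate in code.
The following sketch (an illustration of our own; the
function names are not standard) computes the binomial
and geometric formulas directly:

\begin{lstlisting}
#include <cmath>
#include <iostream>
using namespace std;

// P(X=x) in a binomial distribution with parameters n and p
double binomial_pmf(int n, int x, double p) {
    double coeff = 1; // n choose x, computed iteratively
    for (int i = 1; i <= x; i++) coeff = coeff*(n-x+i)/i;
    return coeff*pow(p,x)*pow(1-p,n-x);
}

// P(X=x) in a geometric distribution with success probability p
double geometric_pmf(int x, double p) {
    return pow(1-p,x-1)*p;
}

int main() {
    cout << binomial_pmf(10,3,1.0/6) << "\n"; // about 0.155
    cout << geometric_pmf(4,1.0/6) << "\n";   // (5/6)^3*(1/6) = 0.0964...
}
\end{lstlisting}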

\section{Markov chains}

\index{Markov chain}

A \key{Markov chain}
% \footnote{A. A. Markov (1856--1922)
% was a Russian mathematician.}
is a random process
that consists of states and transitions between them.
For each state, we know the probabilities
for moving to other states.
A Markov chain can be represented as a graph
whose nodes are states and edges are transitions.

As an example, consider a problem
where we are on floor 1 of an $n$-floor building.
At each step, we randomly walk either one floor
up or one floor down, except that we always
walk one floor up from floor 1 and one floor down
from floor $n$.
What is the probability of being on floor $m$
after $k$ steps?
In this problem, each floor of the building
corresponds to a state in a Markov chain.
For example, if $n=5$, the graph is as follows:
\begin{center}
\begin{tikzpicture}[scale=0.9]
\node[draw, circle] (1) at (0,0) {$1$};
\node[draw, circle] (2) at (2,0) {$2$};
\node[draw, circle] (3) at (4,0) {$3$};
\node[draw, circle] (4) at (6,0) {$4$};
\node[draw, circle] (5) at (8,0) {$5$};
\path[draw,thick,->] (1) edge [bend left=40] node[font=\small,label=$1$] {} (2);
\path[draw,thick,->] (2) edge [bend left=40] node[font=\small,label=$1/2$] {} (3);
\path[draw,thick,->] (3) edge [bend left=40] node[font=\small,label=$1/2$] {} (4);
\path[draw,thick,->] (4) edge [bend left=40] node[font=\small,label=$1/2$] {} (5);
\path[draw,thick,->] (5) edge [bend left=40] node[font=\small,label=below:$1$] {} (4);
\path[draw,thick,->] (4) edge [bend left=40] node[font=\small,label=below:$1/2$] {} (3);
\path[draw,thick,->] (3) edge [bend left=40] node[font=\small,label=below:$1/2$] {} (2);
\path[draw,thick,->] (2) edge [bend left=40] node[font=\small,label=below:$1/2$] {} (1);
%\path[draw,thick,->] (1) edge [bend left=40] node[font=\small,label=below:$1$] {} (2);
\end{tikzpicture}
\end{center}

The probability distribution
of a Markov chain is a vector
$[p_1,p_2,\ldots,p_n]$, where $p_k$ is the
probability that the current state is $k$.
The formula $p_1+p_2+\cdots+p_n=1$ always holds.

In the above scenario, the initial distribution is
$[1,0,0,0,0]$, because we always begin on floor 1.
The next distribution is $[0,1,0,0,0]$,
because we can only move from floor 1 to floor 2.
After this, we can either move one floor up
or one floor down, so the next distribution is
$[1/2,0,1/2,0,0]$, and so on.

An efficient way to simulate the walk in
a Markov chain is to use dynamic programming.
The idea is to maintain the probability distribution,
and at each step go through all the possible moves.
Using this method, we can simulate
a walk of $m$ steps in $O(n^2 m)$ time.
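
The following sketch shows this simulation for the floor
walk (an illustration of our own; the values of $n$ and $m$
are arbitrary). Since each floor has at most two outgoing
transitions, each step here only takes $O(n)$ time; in a
general Markov chain, we would go through all $n^2$
transition probabilities at each step:

\begin{lstlisting}
#include <iostream>
#include <vector>
using namespace std;

int main() {
    int n = 5, m = 3; // arbitrary number of floors and steps
    vector<double> p(n+1, 0);
    p[1] = 1; // we begin on floor 1
    for (int step = 0; step < m; step++) {
        vector<double> q(n+1, 0);
        q[2] += p[1];   // from floor 1, we always move up
        q[n-1] += p[n]; // from floor n, we always move down
        for (int i = 2; i <= n-1; i++) {
            q[i-1] += p[i]/2; // one floor down
            q[i+1] += p[i]/2; // one floor up
        }
        p = q;
    }
    for (int i = 1; i <= n; i++) cout << p[i] << " ";
    cout << "\n"; // for n=5, m=3: 0 0.75 0 0.25 0
}
\end{lstlisting}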

The transitions of a Markov chain can also be
represented as a matrix that updates the
probability distribution.
In the above scenario, the matrix is

\[
\begin{bmatrix}
 0 & 1/2 & 0 & 0 & 0 \\
 1 & 0 & 1/2 & 0 & 0 \\
 0 & 1/2 & 0 & 1/2 & 0 \\
 0 & 0 & 1/2 & 0 & 1 \\
 0 & 0 & 0 & 1/2 & 0 \\
\end{bmatrix}.
\]

When we multiply a probability distribution by this matrix,
we get the new distribution after moving one step.
For example, we can move from the distribution
$[1,0,0,0,0]$ to the distribution
$[0,1,0,0,0]$ as follows:
\[
\begin{bmatrix}
0 & 1/2 & 0 & 0 & 0 \\
1 & 0 & 1/2 & 0 & 0 \\
0 & 1/2 & 0 & 1/2 & 0 \\
0 & 0 & 1/2 & 0 & 1 \\
0 & 0 & 0 & 1/2 & 0 \\
\end{bmatrix}
\begin{bmatrix}
1 \\
0 \\
0 \\
0 \\
0 \\
\end{bmatrix}
=
\begin{bmatrix}
0 \\
1 \\
0 \\
0 \\
0 \\
\end{bmatrix}.
\]

By calculating matrix powers efficiently,
we can calculate the distribution after $m$ steps
in $O(n^3 \log m)$ time.

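The following sketch reuses the matrix power technique of
the previous chapter (an illustration of our own; the type
and function names are not standard) to calculate the
distribution of the floor walk after $m$ steps:

\begin{lstlisting}
#include <iostream>
#include <vector>
using namespace std;

typedef vector<vector<double>> matrix;

// multiplies two n x n matrices in O(n^3) time
matrix mul(const matrix& a, const matrix& b) {
    int n = a.size();
    matrix c(n, vector<double>(n, 0));
    for (int i = 0; i < n; i++)
        for (int k = 0; k < n; k++)
            for (int j = 0; j < n; j++)
                c[i][j] += a[i][k]*b[k][j];
    return c;
}

// calculates a^m in O(n^3 log m) time
matrix mpow(matrix a, long long m) {
    int n = a.size();
    matrix r(n, vector<double>(n, 0));
    for (int i = 0; i < n; i++) r[i][i] = 1; // identity matrix
    while (m) {
        if (m&1) r = mul(r, a);
        a = mul(a, a);
        m >>= 1;
    }
    return r;
}

int main() {
    matrix t = {{0,0.5,0,0,0},
                {1,0,0.5,0,0},
                {0,0.5,0,0.5,0},
                {0,0,0.5,0,1},
                {0,0,0,0.5,0}};
    matrix tm = mpow(t, 3);
    // the first column is the distribution after 3 steps
    // when the initial distribution is [1,0,0,0,0]
    for (int i = 0; i < 5; i++) cout << tm[i][0] << " ";
    cout << "\n"; // 0 0.75 0 0.25 0
}
\end{lstlisting}
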
\section{Randomized algorithms}

\index{randomized algorithm}

Sometimes we can use randomness for solving a problem,
even if the problem is not related to probabilities.
A \key{randomized algorithm} is an algorithm that
is based on randomness.

\index{Monte Carlo algorithm}

A \key{Monte Carlo algorithm} is a randomized algorithm
that may sometimes give a wrong answer.
For such an algorithm to be useful,
the probability of a wrong answer should be small.

\index{Las Vegas algorithm}

A \key{Las Vegas algorithm} is a randomized algorithm
that always gives the correct answer,
but its running time varies randomly.
The goal is to design an algorithm that is
efficient with high probability.

Next we will go through three example problems that
can be solved using randomness.

\subsubsection{Order statistics}

\index{order statistic}

The $k$th \key{order statistic} of an array
is the element at position $k$ after sorting
the array in increasing order.
It is easy to calculate any order statistic
in $O(n \log n)$ time by first sorting the array,
but is it really needed to sort the entire array
just to find one element?

It turns out that we can find order statistics
using a randomized algorithm without sorting the array.
The algorithm, called \key{quickselect}\footnote{In 1961,
C. A. R. Hoare published two algorithms that
are efficient on average: \index{quicksort} \index{quickselect}
\key{quicksort} \cite{hoa61a} for sorting arrays and
\key{quickselect} \cite{hoa61b} for finding order statistics.}, is a Las Vegas algorithm:
its running time is usually $O(n)$
but $O(n^2)$ in the worst case.

The algorithm chooses a random element $x$
of the array, and moves elements smaller than $x$
to the left part of the array,
and all other elements to the right part of the array.
This takes $O(n)$ time when there are $n$ elements.
Assume that the left part contains $a$ elements
and the right part contains $b$ elements.
If $a=k$, element $x$ is the $k$th order statistic.
Otherwise, if $a>k$, we recursively find the $k$th order
statistic for the left part,
and if $a<k$, we recursively find the $r$th order
statistic for the right part where $r=k-a$.
The search continues in a similar way, until the element
has been found.
When each element $x$ is randomly chosen,
the size of the array roughly halves at each step,
so the expected time complexity for
finding the $k$th order statistic is about
\[n+n/2+n/4+n/8+\cdots < 2n = O(n).\]

The worst case of the algorithm still requires $O(n^2)$ time,
because it is possible that $x$ is always chosen
in such a way that it is one of the smallest or largest
elements in the array and $O(n)$ steps are needed.
However, the probability of this is so small
that it almost never happens in practice.
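
The following sketch implements the algorithm (an
illustration of our own; it copies the parts instead of
partitioning in place, and $k$ is 1-indexed here).
In C++, the standard library function
\texttt{nth\_element} provides the same functionality:

\begin{lstlisting}
#include <iostream>
#include <random>
#include <vector>
using namespace std;

mt19937 rng(12345); // arbitrary seed

// returns the kth (1-indexed) smallest element of t
int quickselect(vector<int> t, int k) {
    // choose a random element x of the array
    int x = t[rng()%t.size()];
    vector<int> left, right;
    int equal = 0;
    for (int v : t) {
        if (v < x) left.push_back(v);
        else if (v > x) right.push_back(v);
        else equal++;
    }
    if (k <= (int)left.size()) return quickselect(left, k);
    if (k <= (int)left.size()+equal) return x;
    return quickselect(right, k-(int)left.size()-equal);
}

int main() {
    vector<int> t = {4,1,7,3,9,2};
    cout << quickselect(t, 3) << "\n"; // 3
}
\end{lstlisting}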

\subsubsection{Verifying matrix multiplication}

\index{matrix multiplication}

Our next problem is to \emph{verify}
if $AB=C$ holds when $A$, $B$ and $C$
are matrices of size $n \times n$.
Of course, we can solve the problem
by calculating the product $AB$ again
(in $O(n^3)$ time using the basic algorithm),
but one could hope that verifying the
answer would be easier than calculating it from scratch.

It turns out that we can solve the problem
using a Monte Carlo algorithm\footnote{R. M. Freivalds published
this algorithm in 1977 \cite{fre77}, and it is sometimes
called \index{Freivalds' algorithm} \key{Freivalds' algorithm}.} whose
time complexity is only $O(n^2)$.
The idea is simple: we choose a random vector
$X$ of $n$ elements, and calculate the matrices
$ABX$ and $CX$. If $ABX=CX$, we report that $AB=C$,
and otherwise we report that $AB \neq C$.

The time complexity of the algorithm is
$O(n^2)$, because we can calculate the matrices
$ABX$ and $CX$ in $O(n^2)$ time.
We can calculate the matrix $ABX$ efficiently
by using the representation $A(BX)$, so only two
multiplications of $n \times n$ and $n \times 1$
size matrices are needed.

The drawback of the algorithm is
that there is a small chance that the algorithm
makes a mistake when it reports that $AB=C$.
For example,
\[
\begin{bmatrix}
6 & 8 \\
1 & 3 \\
\end{bmatrix}
\neq
\begin{bmatrix}
8 & 7 \\
3 & 2 \\
\end{bmatrix},
\]
but
\[
\begin{bmatrix}
6 & 8 \\
1 & 3 \\
\end{bmatrix}
\begin{bmatrix}
3 \\
6 \\
\end{bmatrix}
=
\begin{bmatrix}
8 & 7 \\
3 & 2 \\
\end{bmatrix}
\begin{bmatrix}
3 \\
6 \\
\end{bmatrix}.
\]
However, in practice, the probability that the
algorithm makes a mistake is small,
and we can decrease the probability by
verifying the result using multiple random vectors $X$
before reporting that $AB=C$.
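
The following sketch implements the verification for
integer matrices (an illustration of our own; it uses a
random $0/1$ vector, in which case a single trial detects
a wrong result with probability at least $1/2$):

\begin{lstlisting}
#include <iostream>
#include <random>
#include <vector>
using namespace std;

typedef vector<vector<long long>> matrix;

// multiplies an n x n matrix by a vector in O(n^2) time
vector<long long> mulvec(const matrix& a,
                         const vector<long long>& x) {
    int n = a.size();
    vector<long long> r(n, 0);
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            r[i] += a[i][j]*x[j];
    return r;
}

// reports whether AB=C probably holds, using one random vector
bool verify(const matrix& a, const matrix& b, const matrix& c) {
    int n = a.size();
    mt19937 rng(12345); // arbitrary seed
    vector<long long> x(n);
    for (int i = 0; i < n; i++) x[i] = rng()%2; // 0/1 vector
    return mulvec(a, mulvec(b, x)) == mulvec(c, x);
}

int main() {
    matrix a = {{1,1},{0,1}}, b = {{2,0},{3,1}};
    matrix c = {{5,1},{3,1}}; // c = ab, so the answer is correct
    cout << verify(a, b, c) << "\n"; // 1
}
\end{lstlisting}

Calling \texttt{verify} several times and reporting $AB=C$
only if every call returns \texttt{true} makes the error
probability negligible.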

\subsubsection{Graph coloring}

\index{coloring}

Given a graph that contains $n$ nodes and $m$ edges,
our task is to find a way to color the nodes
of the graph using two colors so that
for at least $m/2$ edges, the endpoints
have different colors.
For example, in the graph
\begin{center}
\begin{tikzpicture}[scale=0.9]
\node[draw, circle] (1) at (1,3) {$1$};
\node[draw, circle] (2) at (4,3) {$2$};
\node[draw, circle] (3) at (1,1) {$3$};
\node[draw, circle] (4) at (4,1) {$4$};
\node[draw, circle] (5) at (6,2) {$5$};
\path[draw,thick,-] (1) -- (2);
\path[draw,thick,-] (1) -- (3);
\path[draw,thick,-] (1) -- (4);
\path[draw,thick,-] (3) -- (4);
\path[draw,thick,-] (2) -- (4);
\path[draw,thick,-] (2) -- (5);
\path[draw,thick,-] (4) -- (5);
\end{tikzpicture}
\end{center}
a valid coloring is as follows:
\begin{center}
\begin{tikzpicture}[scale=0.9]
\node[draw, circle, fill=blue!40] (1) at (1,3) {$1$};
\node[draw, circle, fill=red!40] (2) at (4,3) {$2$};
\node[draw, circle, fill=red!40] (3) at (1,1) {$3$};
\node[draw, circle, fill=blue!40] (4) at (4,1) {$4$};
\node[draw, circle, fill=blue!40] (5) at (6,2) {$5$};
\path[draw,thick,-] (1) -- (2);
\path[draw,thick,-] (1) -- (3);
\path[draw,thick,-] (1) -- (4);
\path[draw,thick,-] (3) -- (4);
\path[draw,thick,-] (2) -- (4);
\path[draw,thick,-] (2) -- (5);
\path[draw,thick,-] (4) -- (5);
\end{tikzpicture}
\end{center}
The above graph contains 7 edges, and for 5 of them,
the endpoints have different colors,
so the coloring is valid.

The problem can be solved using a Las Vegas algorithm
that generates random colorings until a valid coloring
has been found.
In a random coloring, the color of each node is
independently chosen so that the probability of
both colors is $1/2$.
Then, the probability that the endpoints
of a single edge have different colors is $1/2$.
Hence, by linearity, the expected number of edges whose endpoints
have different colors is $m/2$.
Since a random coloring reaches this expected value
with reasonable probability, we will quickly find
a valid coloring in practice.
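
The following sketch implements the algorithm
(an illustration of our own, using the example
graph above as input):

\begin{lstlisting}
#include <iostream>
#include <random>
#include <vector>
using namespace std;

int main() {
    // the example graph above: 5 nodes and 7 edges
    int n = 5;
    vector<pair<int,int>> edges = {{1,2},{1,3},{1,4},{3,4},
                                   {2,4},{2,5},{4,5}};
    int m = edges.size();
    mt19937 rng(12345); // arbitrary seed
    vector<int> color(n+1);
    while (true) {
        // choose each color independently with probability 1/2
        for (int i = 1; i <= n; i++) color[i] = rng()%2;
        int good = 0;
        for (auto e : edges) {
            if (color[e.first] != color[e.second]) good++;
        }
        if (2*good >= m) break; // at least m/2 edges are good
    }
    for (int i = 1; i <= n; i++) cout << color[i] << " ";
    cout << "\n";
}
\end{lstlisting}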