Small fixes

Antti H S Laaksonen 2017-02-26 11:57:14 +02:00
parent ee009bd9dd
commit 8a936ed246
3 changed files with 22 additions and 15 deletions

@@ -245,8 +245,9 @@ two $n \times n$ matrices
 in $O(n^3)$ time.
 There are also more efficient algorithms
 for matrix multiplication\footnote{The first such
-algorithm, with time complexity $O(n^{2.80735})$,
-was published in 1969 \cite{str69},
+algorithm was Strassen's algorithm,
+published in 1969 \cite{str69},
+whose time complexity is $O(n^{2.80735})$;
 the best current algorithm
 works in $O(n^{2.37286})$ time \cite{gal14}.},
 but they are mostly of theoretical interest
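
For reference, a minimal C++ sketch of the schoolbook $O(n^3)$ multiplication of two $n \times n$ matrices that this passage refers to; the function name and the use of long long entries are illustrative assumptions.

    #include <vector>
    using namespace std;

    // Schoolbook multiplication of two n x n matrices in O(n^3) time.
    vector<vector<long long>> multiply(const vector<vector<long long>>& a,
                                       const vector<vector<long long>>& b) {
        int n = a.size();
        vector<vector<long long>> c(n, vector<long long>(n, 0));
        for (int i = 0; i < n; i++)
            for (int k = 0; k < n; k++)
                for (int j = 0; j < n; j++)
                    c[i][j] += a[i][k] * b[k][j];
        return c;
    }
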
@@ -749,8 +750,9 @@ $2 \rightarrow 1 \rightarrow 4 \rightarrow 2 \rightarrow 5$.
 \index{Kirchhoff's theorem}
 \index{spanning tree}
-\key{Kirchhoff's theorem}\footnote{G. R. Kirchhoff (1824--1887) was
-a German physicist.} provides a way
+\key{Kirchhoff's theorem}
+%\footnote{G. R. Kirchhoff (1824--1887) was a German physicist.}
+provides a way
 to calculate the number of spanning trees
 of a graph as a determinant of a special matrix.
 For example, the graph
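
As a worked example of what the passage describes, consider the triangle graph (chosen here only for illustration). Its Laplacian matrix (degree matrix minus adjacency matrix), and the determinant obtained after deleting the first row and column, are

\[
L =
\begin{pmatrix}
 2 & -1 & -1 \\
-1 &  2 & -1 \\
-1 & -1 &  2
\end{pmatrix},
\qquad
\det
\begin{pmatrix}
 2 & -1 \\
-1 &  2
\end{pmatrix} = 3,
\]

which matches the three spanning trees of the triangle (each obtained by removing one edge).
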

@@ -359,8 +359,10 @@ The expected value for $X$ in a geometric distribution is
 \index{Markov chain}
-A \key{Markov chain}\footnote{A. A. Markov (1856--1922)
-was a Russian mathematician.} is a random process
+A \key{Markov chain}
+% \footnote{A. A. Markov (1856--1922)
+% was a Russian mathematician.}
+is a random process
 that consists of states and transitions between them.
 For each state, we know the probabilities
 for moving to other states.
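
As a small illustration of the definition (this two-state chain and the column-vector convention are assumptions made here): if from state 1 we stay or move to state 2 with probability 1/2 each, and from state 2 we always move to state 1, the transition probabilities can be collected into a matrix whose columns give the outgoing probabilities of each state, and one step updates a distribution vector by multiplication:

\[
P =
\begin{pmatrix}
 1/2 & 1 \\
 1/2 & 0
\end{pmatrix},
\qquad
P \begin{pmatrix} 1 \\ 0 \end{pmatrix}
=
\begin{pmatrix} 1/2 \\ 1/2 \end{pmatrix}.
\]
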
@@ -515,11 +517,13 @@ just to find one element?
 It turns out that we can find order statistics
 using a randomized algorithm without sorting the array.
-The algorithm is a Las Vegas algorithm:
+The algorithm, called \key{quickselect}\footnote{In 1961,
+C. A. R. Hoare published two algorithms that
+are efficient on average: \index{quicksort} \index{quickselect}
+\key{quicksort} \cite{hoa61a} for sorting arrays and
+\key{quickselect} \cite{hoa61b} for finding order statistics.}, is a Las Vegas algorithm:
 its running time is usually $O(n)$
-but $O(n^2)$ in the worst case\footnote{C. A. R. Hoare
-discovered both this algorithm, known as \key{quickselect} \cite{hoa61b},
-and a similar sorting algorithm, known as \key{quicksort} \cite{hoa61a}.}.
+but $O(n^2)$ in the worst case.
 The algorithm chooses a random element $x$
 in the array, and moves elements smaller than $x$
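
A minimal C++ sketch of the quickselect idea described above; the function name and the three-way partition into smaller, equal and larger elements are illustrative assumptions.

    #include <cstdlib>
    #include <vector>
    using namespace std;

    // Returns the k-th smallest element (0-indexed) of t.
    // Expected running time O(n), worst case O(n^2).
    int quickselect(vector<int> t, int k) {
        while (true) {
            int x = t[rand() % t.size()];          // random element
            vector<int> small, equal, large;
            for (int v : t) {
                if (v < x) small.push_back(v);
                else if (v == x) equal.push_back(v);
                else large.push_back(v);
            }
            if (k < (int)small.size()) t = small;
            else if (k < (int)(small.size() + equal.size())) return x;
            else { k -= small.size() + equal.size(); t = large; }
        }
    }
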
@@ -563,9 +567,10 @@ but one could hope that verifying the
 answer would by easier than to calculate it from scratch.
 It turns out that we can solve the problem
-using a Monte Carlo algorithm whose
-time complexity is only $O(n^2)$\footnote{This algorithm is sometimes
-called \index{Freivalds' algoritm} \key{Freivalds' algorithm} \cite{fre77}.}.
+using a Monte Carlo algorithm\footnote{R. M. Freivalds published
+this algorithm in 1977 \cite{fre77}, and it is sometimes
+called \index{Freivalds' algoritm} \key{Freivalds' algorithm}.} whose
+time complexity is only $O(n^2)$.
 The idea is simple: we choose a random vector
 $X$ of $n$ elements, and calculate the matrices
 $ABX$ and $CX$. If $ABX=CX$, we report that $AB=C$,
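
A minimal C++ sketch of the verification step described above; the use of a random 0/1 vector and integer matrices are assumptions made for illustration.

    #include <cstdlib>
    #include <vector>
    using namespace std;

    typedef vector<vector<long long>> matrix;

    // Compares A(Bx) and Cx for a random 0/1 vector x in O(n^2) time.
    // If the vectors differ, certainly AB != C; if they are equal,
    // AB = C is reported (repeating the test reduces the error probability).
    bool check(const matrix& A, const matrix& B, const matrix& C) {
        int n = A.size();
        vector<long long> x(n), Bx(n, 0), ABx(n, 0), Cx(n, 0);
        for (int i = 0; i < n; i++) x[i] = rand() % 2;
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++) Bx[i] += B[i][j] * x[j];
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++) ABx[i] += A[i][j] * Bx[j];
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++) Cx[i] += C[i][j] * x[j];
        return ABx == Cx;
    }
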

@@ -249,7 +249,7 @@ It turns out that we can easily classify
 any nim state by calculating
 the \key{nim sum} $x_1 \oplus x_2 \oplus \cdots \oplus x_n$,
 where $\oplus$ is the xor operation\footnote{The optimal strategy
-for nim was published in 1901 by C. L. Bouton \cite{bou01}}.
+for nim was published in 1901 by C. L. Bouton \cite{bou01}.}.
 The states whose nim sum is 0 are losing states,
 and all other states are winning states.
 For example, the nim sum for
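
A direct translation of this classification into code (the function name is an assumption): compute the xor of all heap sizes and check whether it is zero. For instance, for heaps of sizes 10, 12 and 5 the nim sum is $10 \oplus 12 \oplus 5 = 3$, so the state is a winning state.

    #include <vector>
    using namespace std;

    // A nim state is a losing state exactly when the nim sum
    // (xor of all heap sizes) is 0.
    bool losing(const vector<int>& heaps) {
        int s = 0;
        for (int x : heaps) s ^= x;
        return s == 0;
    }
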
@@ -369,7 +369,7 @@ so the nim sum is not 0.
 \index{Sprague–Grundy theorem}
 The \key{Sprague–Grundy theorem}\footnote{The theorem was discovered
-independently by R. Sprague \cite{spr35} and P. M. Grundy \cite{gru39}} generalizes the
+independently by R. Sprague \cite{spr35} and P. M. Grundy \cite{gru39}.} generalizes the
 strategy used in nim to all games that fulfil
 the following requirements: