Small improvements
parent 02465b379b
commit e8608fa76f
@@ -7,24 +7,22 @@ that has a square root in its time complexity.
 A square root can be seen as a ''poor man's logarithm'':
 the complexity $O(\sqrt n)$ is better than $O(n)$
 but worse than $O(\log n)$.
-In any case, many square root algorithms are fast in practice
-and have small constant factors.
+In any case, many square root algorithms are fast and usable in practice.
 
 As an example, let us consider the problem of
 creating a data structure that supports
 two operations on an array:
 modifying an element at a given position
 and calculating the sum of elements in the given range.
 
 We have previously solved the problem using
-a binary indexed tree and segment tree,
+binary indexed and segment trees,
 that support both operations in $O(\log n)$ time.
 However, now we will solve the problem
 in another way using a square root structure
 that allows us to modify elements in $O(1)$ time
 and calculate sums in $O(\sqrt n)$ time.
 
-The idea is to divide the array into blocks
+The idea is to divide the array into \emph{blocks}
 of size $\sqrt n$ so that each block contains
 the sum of elements inside the block.
 For example, an array of 16 elements will be
@@ -64,8 +62,8 @@ divided into blocks of 4 elements as follows:
 \end{tikzpicture}
 \end{center}
 
-Using this structure,
-it is easy to modify the array,
+In this structure,
+it is easy to modify array elements,
 because it is only needed to update
 the sum of a single block
 after each modification,
@@ -159,9 +157,9 @@ and sums of blocks between them:
 \end{center}
 
 Since the number of single elements is $O(\sqrt n)$
-and also the number of blocks is $O(\sqrt n)$,
+and the number of blocks is also $O(\sqrt n)$,
 the time complexity of the sum query is $O(\sqrt n)$.
-Thus, the parameter $\sqrt n$ balances two things:
+In this case, the parameter $\sqrt n$ balances two things:
 the array is divided into $\sqrt n$ blocks,
 each of which contains $\sqrt n$ elements.
 
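
For illustration, here is a minimal sketch of the square root structure described above (an editorial addition, not part of the commit; the array size 16 and block size 4 follow the chapter's example, and the names are placeholders):

\begin{lstlisting}
#include <bits/stdc++.h>
using namespace std;

const int N = 16;                    // array size (example)
const int B = 4;                     // block size, about sqrt(N)
long long a[N], blocksum[N/B + 1];   // blocksum[i] = sum of block i

// modify the element at position k: O(1), only one block sum changes
void update(int k, long long x) {
    blocksum[k/B] += x - a[k];
    a[k] = x;
}

// sum of a[l..r]: single elements at both ends, whole blocks in
// between, O(sqrt(N)) time in total
long long query(int l, int r) {
    long long s = 0;
    while (l <= r && l % B != 0) s += a[l++];               // left partial block
    while (l + B <= r + 1) { s += blocksum[l/B]; l += B; }  // full blocks
    while (l <= r) s += a[l++];                             // right partial block
    return s;
}

int main() {
    for (int i = 0; i < N; i++) update(i, i + 1);  // fill with 1..16
    cout << query(3, 10) << "\n";                  // prints 4+5+...+11 = 60
}
\end{lstlisting}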
@@ -182,8 +180,9 @@ elements.
 \index{batch processing}
 
 Sometimes the operations of an algorithm
-can be divided into batches,
-each of which can be processed separately.
+can be divided into \emph{batches}.
+Each batch contains a sequence of operations
+which will be processed one after another.
 Some precalculation is done
 between the batches
 in order to process the future operations more efficiently.
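
As a rough editorial sketch of the batch idea (not from the commit; the routines precalc and process stand for problem-specific work), the general pattern looks like this:

\begin{lstlisting}
#include <bits/stdc++.h>
using namespace std;

// Hypothetical skeleton: n operations are split into batches of about
// sqrt(n) operations; precalc() runs between batches so that process()
// can handle the operations of the next batch efficiently.
void runInBatches(int n, const function<void()>& precalc,
                  const function<void(int)>& process) {
    int b = max(1, (int)sqrt((double)n));      // batch size ~ sqrt(n)
    for (int start = 0; start < n; start += b) {
        precalc();                             // e.g., a breadth-first search
        for (int i = start; i < min(n, start + b); i++) {
            process(i);                        // uses precalculated data
        }
    }
}
\end{lstlisting}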
@@ -227,8 +226,8 @@ $O((k^2+n) \sqrt n)$ time.
 First, there are $O(\sqrt n)$ breadth-first searches
 and each search takes $O(k^2)$ time.
 Second, the total number of
-squares processed during the algorithm
-is $O(n)$, and at each square,
+distances calculated during the algorithm
+is $O(n)$, and when calculating each distance,
 we go through a list of $O(\sqrt n)$ squares.
 
 If the algorithm would perform a breadth-first search
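
The two contributions above combine to the stated bound:
\[
\underbrace{O(\sqrt n) \cdot O(k^2)}_{\text{breadth-first searches}}
+ \underbrace{O(n) \cdot O(\sqrt n)}_{\text{distance calculations}}
= O((k^2+n)\sqrt n).
\]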
@@ -243,8 +242,8 @@ but in addition, a factor of $n$ is replaced by $\sqrt n$.
 
 \section{Subalgorithms}
 
-Some square root algorithms consists of
-subalgorithms that are specialized for different
+Some square root algorithms consist of
+\emph{subalgorithms} that are specialized for different
 input parameters.
 Typically, there are two subalgorithms:
 one algorithm is efficient when
@@ -319,6 +318,8 @@ is named after Mo Tao, a Chinese competitive programmer, but
 the technique has appeared earlier in the literature.} can be used in many problems
 that require processing range queries in
 a \emph{static} array.
+Since the array is static, the queries can be
+processed in any order.
 Before processing the queries, the algorithm
 sorts them in a special order which guarantees
 that the algorithm works efficiently.
@@ -344,9 +345,9 @@ in the range, and secondarily by the position of the
 last element in the range.
 It turns out that using this order, the algorithm
 only performs $O(n \sqrt n)$ operations,
-because the left endpoint of the range moves
+because the left endpoint moves
 $n$ times $O(\sqrt n)$ steps,
-and the right endpoint of the range moves
+and the right endpoint moves
 $\sqrt n$ times $O(n)$ steps. Thus, both
 endpoints move a total of $O(n \sqrt n)$ steps during the algorithm.
 
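
The sort order described here can be expressed as a comparator; the following is an editorial sketch outside the diff, with placeholder names:

\begin{lstlisting}
#include <bits/stdc++.h>
using namespace std;

// A query [a,b] with its original index. Queries are sorted primarily
// by the block of the left endpoint (block size about sqrt(n)) and
// secondarily by the right endpoint.
struct Query { int a, b, idx; };

void sortQueries(vector<Query>& q, int n) {
    int blockSize = max(1, (int)sqrt((double)n));
    sort(q.begin(), q.end(), [&](const Query& x, const Query& y) {
        if (x.a / blockSize != y.a / blockSize)
            return x.a / blockSize < y.a / blockSize;
        return x.b < y.b;
    });
}
\end{lstlisting}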
@@ -362,7 +363,7 @@ In Mo's algorithm, the queries are always sorted
 in the same way, but it depends on the problem
 how the answer to the query is maintained.
 In this problem, we can maintain an array
-\texttt{c} where $\texttt{c}[x]$
+\texttt{count} where $\texttt{count}[x]$
 indicates the number of times an element $x$
 occurs in the active range.
 
@@ -404,18 +405,18 @@ there will be three steps:
 the left endpoint moves one step to the right,
 and the right endpoint moves two steps to the right.
 
-After each step, the array \texttt{c}
+After each step, the array \texttt{count}
 needs to be updated.
 After adding an element $x$,
 we increase the value of
-$\texttt{c}[x]$ by one,
-and if $\texttt{c}[x]=1$ after this,
-we also increase the answer to the query by one.
+$\texttt{count}[x]$ by 1,
+and if $\texttt{count}[x]=1$ after this,
+we also increase the answer to the query by 1.
 Similarly, after removing an element $x$,
 we decrease the value of
-$\texttt{c}[x]$ by one,
-and if $\texttt{c}[x]=0$ after this,
-we also decrease the answer to the query by one.
+$\texttt{count}[x]$ by 1,
+and if $\texttt{count}[x]=0$ after this,
+we also decrease the answer to the query by 1.
 In this problem, the time needed to perform
 each step is $O(1)$, so the total time complexity
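
Tying these pieces together, here is an editorial sketch (not from the book) of the query loop for this problem, maintaining the answer as the number of values that occur in the active range; the queries are assumed to be already sorted in the order sketched above, and values are assumed to fit in [0, MAXV):

\begin{lstlisting}
#include <bits/stdc++.h>
using namespace std;

const int MAXV = 1000000;
int cnt[MAXV];                       // the chapter's "count" array

struct Query { int a, b, idx; };     // same struct as in the earlier sketch

vector<int> processQueries(const vector<int>& x, const vector<Query>& q) {
    vector<int> answer(q.size());
    int a = 0, b = -1, distinct = 0; // active range [a,b], initially empty
    auto add = [&](int i) { if (++cnt[x[i]] == 1) distinct++; };
    auto del = [&](int i) { if (--cnt[x[i]] == 0) distinct--; };
    for (auto& qu : q) {
        while (b < qu.b) add(++b);   // move right endpoint to the right
        while (a > qu.a) add(--a);   // move left endpoint to the left
        while (b > qu.b) del(b--);   // move right endpoint to the left
        while (a < qu.a) del(a++);   // move left endpoint to the right
        answer[qu.idx] = distinct;   // answer for the active range [a,b]
    }
    return answer;
}
\end{lstlisting}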