Complexity

14.1 : Formal definitions
14.2 : The Master Theorem
14.2.1 : Quicksort and mergesort
14.2.2 : Matrix-matrix multiplication

14 Complexity

At various places in this book we are interested in how many operations an algorithm takes. What these operations are depends on the context, but we often count additions (or subtractions) and multiplications. This is called the arithmetic or computational complexity of an algorithm. For instance, summing $n$ numbers takes $n-1$ additions. Another quantity we may want to describe is the amount of space (computer memory) an algorithm needs. Sometimes the space holding the input and output is all that is needed, but some algorithms need temporary space as well. The total required space is called the space complexity of an algorithm.

Both arithmetic and space complexity depend on some description of the input, for instance, for summing an array of numbers, the length $n$ of the array is all that is needed. We express this dependency by saying `summing an array of numbers has time complexity $n-1$ additions, where $n$ is the length of the array'.
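
To make the operation count concrete, here is a minimal Python sketch (the function array_sum and the sample values are ours, for illustration only) that sums an array while counting the additions performed:

    def array_sum(a):
        # Sum an array of numbers, counting the additions performed.
        total = a[0]
        additions = 0
        for x in a[1:]:
            total += x
            additions += 1
        return total, additions

    # n = 4 elements take exactly n - 1 = 3 additions:
    s, ops = array_sum([3.0, 1.5, 2.5, 7.0])
    print(s, ops)   # 14.0 3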

The time (or space) the summing algorithm takes does not depend on other factors, such as the values of the numbers. By contrast, the behavior of some algorithms, such as computing the greatest common divisor of an array of integers, can depend on the actual values. In the chapter on sorting you will see both kinds.

Exercise What is the time and space complexity of multiplying two square matrices of size $n\times n$? Assume that an addition and a multiplication take the same amount of time.
End of exercise

Often we aim to simplify the formulas that describe time or space complexity. For instance, if the complexity of an algorithm is $n^2+2n$, we see that for $n>2$ the complexity is less than $2n^2$, and for $n>4$ it is less than $(3/2)n^2$. On the other hand, for all values of $n$ the complexity is at least $n^2$. Clearly, the quadratic term $n^2$ is the most important one, and the linear term $n$ becomes relatively less and less important. We express this informally by saying that the complexity is quadratic in $n$ as $n\rightarrow\infty$: there are constants $c,C$ so that for $n$ large enough the complexity is at least $cn^2$ and at most $Cn^2$.

This is expressed for short by saying that the complexity is of order $n^2$, written as $O(n^2)$ as $n\rightarrow\infty$. In chapter Numerical treatment of differential equations you will see phenomena that are described as orders of a parameter that goes to zero. In that case we write, for instance, $f(h)=O(h^2)$ as $h\downarrow0$, meaning that $f$ is bounded below by $ch^2$ and above by $Ch^2$ for certain constants $c,C$ and $h$ small enough.
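
As a small worked example (ours, not taken from the chapter referenced above): for a smooth function $f$, Taylor expansion of the one-sided difference quotient gives \begin{equation} \frac{f(x+h)-f(x)}{h} - f'(x) = \frac{h}{2}f''(x) + \frac{h^2}{6}f'''(x) + \cdots = O(h) \quad\hbox{as } h\downarrow 0, \end{equation} so the error of approximating $f'(x)$ this way is of first order in $h$.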

14.1 Formal definitions

Formally, we distinguish the following complexity types:

  * Big-O complexity: $f(n)=O\bigl(g(n)\bigr)$ as $n\rightarrow\infty$ if there are constants $C>0$ and $n_0$ such that $|f(n)|\leq Cg(n)$ for all $n\geq n_0$.

  * Big-Omega complexity: $f(n)=\Omega\bigl(g(n)\bigr)$ if there is a constant $c>0$ such that $f(n)\geq cg(n)$ for $n$ large enough.

  * Big-Theta complexity: $f(n)=\Theta\bigl(g(n)\bigr)$ if $f(n)$ is both $O\bigl(g(n)\bigr)$ and $\Omega\bigl(g(n)\bigr)$; that is, $cg(n)\leq f(n)\leq Cg(n)$ for constants $c,C>0$ and $n$ large enough.

  * Little-o complexity: $f(n)=o\bigl(g(n)\bigr)$ if $f(n)/g(n)\rightarrow0$ as $n\rightarrow\infty$.

14.2 The Master Theorem

For many operations, complexity is easily expressed recursively. For instance, the complexity of quicksort (assuming the pivot splits the array evenly) obeys the recurrence \begin{equation} T(n) = 2T(n/2) + O(n), \end{equation} expressing that sorting an array of $n$ numbers involves twice sorting $n/2$ numbers, plus the splitting step, which is linear in the array size.

The so-called master theorem of complexity allows you to convert these recursive expressions to a closed-form formula.

Suppose the function $T$ satisfies \begin{equation} T(n) = a T\bigl( \frac{n}{b} \bigr) + f(n) \end{equation} with $a\geq 1$, $b>1$, and $f(\cdot)$ asymptotically positive. Then there are three cases:

  1. If

    \begin{equation} f(n)=O\left( n^{\log_b(a)-\epsilon} \right ) \hbox{ for some $\epsilon>0$,} \quad\hbox{then}\quad T(n) = \Theta \left( n^{\log_b(a)} \right ) \end{equation}

  2. If

    \begin{equation} f(n)=\Theta\left( n^{\log_b(a)} \right ) \quad\hbox{then}\quad T(n) = \Theta \left( n^{\log_b(a)}\log(n) \right ) \end{equation}

  3. If

    \begin{equation} f(n)=\Omega\left( n^{\log_b(a)+\epsilon} \right ) \hbox{ for some $\epsilon>0$,} \quad\hbox{then}\quad T(n) = \Theta \left( f(n) \right ) \end{equation}

    provided $f$ also satisfies the regularity condition $af(n/b)\leq cf(n)$ for some constant $c<1$ and $n$ large enough.

Some examples follow.

14.2.1 Quicksort and mergesort

The quicksort and mergesort algorithms recursively divide work into two (hopefully) equal halves, preceded/followed by a linear splitting/merging step. Thus their complexity satisfies \begin{equation} T(n) = 2T(n/2) + O(n) \end{equation} giving, with $a=b=2$ and $f(n)=n$, by case 2: \begin{equation} T(n) = \Theta( n\log n ). \end{equation}
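
To connect the recurrence to actual code, here is a minimal mergesort sketch in Python (an illustration of ours, not taken from the book's codebase), with comments marking which part contributes which term of the recurrence:

    def merge_sort(a):
        # Sorting fewer than two elements costs nothing: the base case.
        n = len(a)
        if n <= 1:
            return a
        # Two recursive calls on equal halves: the 2 T(n/2) term.
        left = merge_sort(a[:n // 2])
        right = merge_sort(a[n // 2:])
        # Linear merge of the sorted halves: the O(n) term.
        merged = []
        i = j = 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                merged.append(left[i]); i += 1
            else:
                merged.append(right[j]); j += 1
        merged.extend(left[i:])
        merged.extend(right[j:])
        return merged

    print(merge_sort([5, 2, 8, 1, 9, 3]))   # [1, 2, 3, 5, 8, 9]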

14.2.2 Matrix-matrix multiplication

We already know that the ordinary triple-loop formulation of matrix-matrix multiplication has a complexity of $O(n^3)$. Multiplying matrices by recursive bisection into $2\times 2$ blocks gives \begin{equation} T(n) = 8T(n/2) + O(n^2). \end{equation} Substituting $a=8,b=2$ gives $\log_b(a)=3$ so case 1 applies with $\epsilon=1$, and we find \begin{equation} T(n) = \Theta( n^3 ). \end{equation}
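
A minimal Python sketch of this recursive bisection (our illustration; it assumes $n$ is a power of two and uses plain nested lists) shows where the eight half-size products and the $O(n^2)$ addition work arise:

    def block_multiply(A, B):
        # Multiply two n x n matrices, n a power of 2, by recursive
        # bisection into 2 x 2 blocks: T(n) = 8 T(n/2) + O(n^2).
        n = len(A)
        if n == 1:
            return [[A[0][0] * B[0][0]]]
        h = n // 2
        def block(M, r, c):
            # Extract the h x h submatrix with top-left corner (r, c).
            return [row[c:c + h] for row in M[r:r + h]]
        def add(X, Y):
            # Elementwise matrix addition: O(n^2) work per level.
            return [[x + y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]
        A11, A12, A21, A22 = block(A, 0, 0), block(A, 0, h), block(A, h, 0), block(A, h, h)
        B11, B12, B21, B22 = block(B, 0, 0), block(B, 0, h), block(B, h, 0), block(B, h, h)
        # Eight recursive half-size products: the 8 T(n/2) term.
        C11 = add(block_multiply(A11, B11), block_multiply(A12, B21))
        C12 = add(block_multiply(A11, B12), block_multiply(A12, B22))
        C21 = add(block_multiply(A21, B11), block_multiply(A22, B21))
        C22 = add(block_multiply(A21, B12), block_multiply(A22, B22))
        # Reassemble the four quadrants into the full result.
        return [r1 + r2 for r1, r2 in zip(C11, C12)] + \
               [r1 + r2 for r1, r2 in zip(C21, C22)]

    A = [[1, 2], [3, 4]]
    print(block_multiply(A, A))   # [[7, 10], [15, 22]]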

Slightly more interesting, the Strassen algorithm [St:gaussnotoptimal] performs the $2\times2$ block multiplication with only 7 half-size products, so $a=7$, giving an asymptotically lower complexity of $O(n^{\log_2 7}) \approx O(n^{2.81})$. There are other algorithms like this [Pa:combinations], with slightly lower complexities still, though all well above $O(n^2)$.
