
Section 4.3 The Big “Oh” and Little “Oh” Notations

Let \(f:\mathbb{N}\longrightarrow \mathbb{R}\) and \(g:\mathbb{N}\longrightarrow\mathbb{R}\) be functions. We write \(f=O(g)\text{,}\) and say \(f\) is “Big Oh” of \(g\text{,}\) when there are a positive constant \(c\) and an integer \(n_0\) so that \(f(n)\le cg(n)\) whenever \(n>n_0\text{.}\) Although this notation has a long history, we can provide a quite modern justification: if \(f\) and \(g\) describe the number of operations required by two algorithms on an input of size \(n\text{,}\) then \(f=O(g)\) means that, up to a constant factor, the first algorithm is no harder than the second once the problem size is large.
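To see the definition in action, consider \(f(n)=5n^2+3n+20\) and \(g(n)=n^2\) (a made-up pair, chosen only for illustration). When \(n\ge 5\text{,}\) we have \(3n\le n^2\) and \(20\le n^2\text{,}\) so that \(f(n)\le 5n^2+n^2+n^2=7n^2\text{.}\) Taking \(c=7\) and \(n_0=4\) thus witnesses \(f=O(g)\text{.}\)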

We are particularly interested in comparing functions against certain natural benchmarks, e.g., \(\log\log n\text{,}\) \(\log n\text{,}\) \(\sqrt{n}\text{,}\) \(n^\alpha\) where \(\alpha<1\text{,}\) \(n\text{,}\) \(n^2\text{,}\) \(n^3\text{,}\) \(n^c\) where \(c>1\) is a constant, \(n^{\log n}\text{,}\) \(2^n\text{,}\) \(n!\text{,}\) \(2^{n^2}\text{,}\) etc.
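To get a feel for how far apart these benchmarks sit, here is a short Python sketch (our illustration; the sample sizes and the natural-log reading of \(n^{\log n}\) are our own choices) that tabulates the base-\(10\) logarithm of each benchmark. Working with logarithms analytically sidesteps overflow for the fastest-growing functions.

```python
import math

# log10 of each benchmark, computed analytically so that even the
# fastest-growing functions never overflow a float.
# lgamma(n + 1) = ln(n!), hence log10(n!) = lgamma(n + 1) / ln(10).
benchmarks = {
    "log log n": lambda n: math.log10(math.log(math.log(n))),
    "log n":     lambda n: math.log10(math.log(n)),
    "sqrt(n)":   lambda n: 0.5 * math.log10(n),
    "n":         lambda n: math.log10(n),
    "n^2":       lambda n: 2 * math.log10(n),
    "n^log n":   lambda n: math.log(n) * math.log10(n),  # natural-log exponent
    "2^n":       lambda n: n * math.log10(2),
    "n!":        lambda n: math.lgamma(n + 1) / math.log(10),
    "2^(n^2)":   lambda n: n * n * math.log10(2),
}

for n in (10, 100, 1000):
    print(f"n = {n}")
    for name, f in benchmarks.items():
        print(f"  log10({name:9}) = {f(n):14.1f}")
```

Already at \(n=100\) the gap is stark: \(\log_{10}(2^{n^2})=n^2\log_{10}2\approx 3010\text{,}\) so \(2^{n^2}\) has roughly three thousand digits, while \(\log\log n\) has barely moved.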

For example, in Subsection 3.5.2 we learned that there are sorting algorithms with running time \(O(n\log n)\text{,}\) where \(n\) is the number of integers to be sorted. As a second example, we will learn that there is an algorithm with running time \(O(n^2)\) that finds all shortest paths in an oriented graph on \(n\) vertices with non-negative weights on its edges. At the other extreme, no one knows whether there are a constant \(c\) and an algorithm with running time \(O(n^c)\) that determines whether the chromatic number of a graph is at most three.
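As a rough empirical companion to the first of these claims, here is a minimal Python sketch (ours, not the algorithm as presented in Subsection 3.5.2) that counts the comparisons merge sort performs on random input and sets the count beside \(n\log_2 n\text{.}\)

```python
import math
import random

def merge_sort(a):
    """Sort list a; return (sorted list, number of comparisons made)."""
    if len(a) <= 1:
        return a, 0
    mid = len(a) // 2
    left, c_left = merge_sort(a[:mid])
    right, c_right = merge_sort(a[mid:])
    merged, i, j, comps = [], 0, 0, 0
    # Merge the two sorted halves, counting one comparison per step.
    while i < len(left) and j < len(right):
        comps += 1
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged, c_left + c_right + comps

for n in (1000, 10000, 100000):
    _, comps = merge_sort(random.sample(range(10 * n), n))
    print(f"n = {n:6d}: {comps:8d} comparisons, n log2 n ~ {n * math.log2(n):10.0f}")
```

The observed counts track \(n\log_2 n\) closely, which is just what the bound \(O(n\log n)\) predicts.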

It is important to remember that when we write \(f=O(g)\text{,}\) we are asserting only an upper bound: \(f\) grows no faster than \(g\text{,}\) but it may in fact grow much more slowly. By contrast, there will be times when we really know that one function dominates another, and we have a second kind of notation to capture this relationship.

Let \(f:\mathbb{N}\longrightarrow \mathbb{R}\) and \(g:\mathbb{N}\longrightarrow\mathbb{R}\) be functions with \(f(n)>0\) and \(g(n)>0\) for all \(n\text{.}\) We write \(f=o(g)\text{,}\) and say that \(f\) is “Little Oh” of \(g\text{,}\) when \(\lim_{n\rightarrow\infty}f(n)/g(n)=0\text{.}\) For example, \(\ln n=o(n^{0.2})\text{;}\) \(n^\alpha=o(n^{\beta})\) whenever \(0<\alpha<\beta\text{;}\) and \(n^{100}=o(c^n)\) for every constant \(c>1\text{.}\) In particular, we write \(f(n)=o(1)\) when \(\lim_{n\rightarrow\infty}f(n)=0\text{.}\)
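The first two of these claims can be verified directly. For the second, \(n^\alpha/n^\beta=n^{\alpha-\beta}\rightarrow 0\) because the exponent \(\alpha-\beta\) is negative. For the first, L'Hôpital's rule gives \(\lim_{n\rightarrow\infty}\frac{\ln n}{n^{0.2}}=\lim_{n\rightarrow\infty}\frac{1/n}{0.2n^{-0.8}}=\lim_{n\rightarrow\infty}\frac{5}{n^{0.2}}=0\text{.}\)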