Growth of Functions

The analysis of algorithms often requires a body of mathematical tools. In this lecture, we introduce some important tools and standards of notation.

1  Asymptotic Notation

The notation used to describe the asymptotic running time of an algorithm is defined in terms of functions whose domains are the set of natural numbers.
This notation describes how the running time scales as the problem gets larger. In some cases it is important to look at how fast an algorithm is for small inputs, particularly when the same task is performed over and over again. However, we are mainly concerned with algorithms for large problems, so we look at how the performance scales as the problem size approaches infinity.

1.1   O-notation

O-notation, pronounced "big-oh notation", is used to describe the asymptotic upper bound of an algorithm. Formally, it is defined as:
For a given function $g(n)$, we denote by $O(g(n))$ the set of functions $O(g(n)) = \{ f(n)$: there exist positive constants $c$ and $n_0$ such that $0 \le f(n) \le c\,g(n)$ for all $n \ge n_0 \}$.
Less formally, this means that for all sufficiently big $n$, the running time (space requirements, etc.) of the algorithm is at most $g(n)$ multiplied by some constant. Even less formally, if $f(n) = O(n^2)$ then we can say that the algorithm runs no slower than $n^2$.
[Figure: images/Introduction_BigOGraph.png]
Figure 1: "There exist positive constants $c$ and $n_0$ such that $0 \le f(n) \le c\,g(n)$ for all $n \ge n_0$." $f(n)$ is thus $O(g(n))$.
For example, $3n^2 + 5n = O(n^2)$: take $c = 4$ and $n_0 = 5$, since $3n^2 + 5n \le 4n^2$ whenever $n \ge 5$.
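To make the definition concrete, here is a minimal Python sketch that numerically checks a candidate pair of witnesses $c$ and $n_0$ over a finite range of $n$. The helper name is_bounded_above and the chosen witnesses are our own, not part of the formal definition, and a finite check can only lend evidence for an asymptotic claim, never prove it.

    def is_bounded_above(f, g, c, n0, n_max=10_000):
        """Check 0 <= f(n) <= c*g(n) for every n0 <= n <= n_max."""
        return all(0 <= f(n) <= c * g(n) for n in range(n0, n_max + 1))

    # The witnesses c = 4, n0 = 5 work for 3n^2 + 5n = O(n^2),
    # since 5n <= n^2 once n >= 5.
    print(is_bounded_above(lambda n: 3*n*n + 5*n, lambda n: n*n, c=4, n0=5))  # True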

1.2   Ω-notation

Ω-notation, pronounced "big-omega notation", is used to describe the asymptotic lower bound of an algorithm. Formally, it is defined as:
For a given function $g(n)$, we denote by $\Omega(g(n))$ the set of functions $\Omega(g(n)) = \{ f(n)$: there exist positive constants $c$ and $n_0$ such that $0 \le c\,g(n) \le f(n)$ for all $n \ge n_0 \}$.
Less formally, this means that for all sufficiently big $n$, the running time (space requirements, etc.) of the algorithm is at least $g(n)$ multiplied by some constant. Even less formally, if $f(n) = \Omega(n^2)$ then we can say that the algorithm runs no faster than $n^2$.
[Figure: images/Introduction_BigOmegaGraph.png]
Figure 2: "There exist positive constants $c$ and $n_0$ such that $0 \le c\,g(n) \le f(n)$ for all $n \ge n_0$." $f(n)$ is thus $\Omega(g(n))$.
For example, $n^2/2 - 3n = \Omega(n^2)$: take $c = 1/4$ and $n_0 = 12$, since $n^2/2 - 3n \ge n^2/4$ whenever $n \ge 12$.
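A similar Python sketch checks the Ω-definition; again, the helper name is_bounded_below and the witnesses are illustrative assumptions, and sampling a finite range is evidence rather than proof.

    def is_bounded_below(f, g, c, n0, n_max=10_000):
        """Check 0 <= c*g(n) <= f(n) for every n0 <= n <= n_max."""
        return all(0 <= c * g(n) <= f(n) for n in range(n0, n_max + 1))

    # The witnesses c = 1/4, n0 = 12 work for n^2/2 - 3n = Omega(n^2),
    # since n^2/4 >= 3n once n >= 12.
    print(is_bounded_below(lambda n: n*n/2 - 3*n, lambda n: n*n, c=0.25, n0=12))  # True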

1.3   Θ-notation

We can now talk about algorithms in terms of a limit to how fast or how slow they run. However, this notation is not asymptotically tight. If we say that an algorithm runs in $f(n) = O(n^2)$ time, then that algorithm could actually run in linear time, since $n = O(n^2)$ as well. Usually we want a tight bound: we want to be able to say that an algorithm runs no slower and no faster than a particular function. To do this, we use Θ-notation.
For a given function $g(n)$, we denote by $\Theta(g(n))$ the set of functions $\Theta(g(n)) = \{ f(n)$: there exist positive constants $c_1$, $c_2$, and $n_0$ such that $0 \le c_1 g(n) \le f(n) \le c_2 g(n)$ for all $n \ge n_0 \}$. Equivalently, $f(n) = \Theta(g(n))$ if and only if $f(n) = O(g(n))$ and $f(n) = \Omega(g(n))$.
[Figure: images/Introduction_ThetaGraph.png]
Figure 3: "There exist positive constants $c_1$, $c_2$, and $n_0$ such that $0 \le c_1 g(n) \le f(n) \le c_2 g(n)$ for all $n \ge n_0$." $f(n)$ is thus $\Theta(g(n))$.
For example, $n^2/2 - 3n = \Theta(n^2)$: take $c_1 = 1/4$, $c_2 = 1/2$, and $n_0 = 12$.
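Combining the two previous checks gives a sketch of the Θ-definition; the helper name is_theta is our own, and as before a finite scan is only suggestive.

    def is_theta(f, g, c1, c2, n0, n_max=10_000):
        """Check 0 <= c1*g(n) <= f(n) <= c2*g(n) for every n0 <= n <= n_max."""
        return all(0 <= c1 * g(n) <= f(n) <= c2 * g(n)
                   for n in range(n0, n_max + 1))

    # The witnesses c1 = 1/4, c2 = 1/2, n0 = 12 work for n^2/2 - 3n = Theta(n^2).
    print(is_theta(lambda n: n*n/2 - 3*n, lambda n: n*n,
                   c1=0.25, c2=0.5, n0=12))  # True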

1.4   o-notation

Sometimes we want to identify bounds that are not asymptotically tight. o-notation, pronounced "little-oh notation", is used to denote an upper bound that is not asymptotically tight.
For a given function $g(n)$, we denote by $o(g(n))$ the set of functions $o(g(n)) = \{ f(n)$: for any positive constant $c > 0$, there exists a constant $n_0 > 0$ such that $0 \le f(n) < c\,g(n)$ for all $n \ge n_0 \}$.
The implication of this is that $f(n)$ becomes insignificant relative to $g(n)$ as $n$ approaches infinity:

$$\lim_{n \to \infty} \frac{f(n)}{g(n)} = 0$$

For example, $2n = o(n^2)$, but $2n^2 \ne o(n^2)$.
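One way to see the limit behaviour is to tabulate the ratio $f(n)/g(n)$ for growing $n$, as in the Python sketch below; the sample points are arbitrary, and a limit must of course be established analytically rather than by sampling.

    # Ratios for f(n) = 2n and f(n) = 2n^2 against g(n) = n^2.
    for n in (10, 100, 1_000, 10_000, 100_000):
        print(n, (2 * n) / n**2, (2 * n * n) / n**2)
    # The first ratio shrinks toward 0, consistent with 2n = o(n^2);
    # the second is constant at 2, so 2n^2 is not o(n^2).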

1.5   ω-notation

ω-notation, pronounced "little-omega notation", is used to denote a lower bound that is not asymptotically tight.
For a given function $g(n)$, we denote by $\omega(g(n))$ the set of functions $\omega(g(n)) = \{ f(n)$: for any positive constant $c > 0$, there exists a constant $n_0 > 0$ such that $0 \le c\,g(n) < f(n)$ for all $n \ge n_0 \}$.
The implication of this is that g(n) becomes insignificant relative to f(n) as n approaches infinity:

$$\lim_{n \to \infty} \frac{f(n)}{g(n)} = \infty$$

For example, $n^2/2 = \omega(n)$, but $n^2/2 \ne \omega(n^2)$.
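The same ratio table illustrates the ω-limit, with the ratio now growing without bound; again the sample points are arbitrary and merely suggestive.

    # Ratios for f(n) = n^2/2 against g(n) = n and g(n) = n^2.
    for n in (10, 100, 1_000, 10_000, 100_000):
        print(n, (n * n / 2) / n, (n * n / 2) / n**2)
    # The first ratio grows without bound, consistent with n^2/2 = omega(n);
    # the second is constant at 1/2, so n^2/2 is not omega(n^2).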


