The notations we use to describe the asymptotic running time of an algorithm are defined in terms of functions whose domains are the set of natural numbers **N** = {0, 1, 2, . . .}. Such notations are convenient for describing the worst-case running-time function *T*(*n*), which is usually defined only on integer input sizes. It is sometimes convenient, however, to *abuse* asymptotic notation in a variety of ways. For example, the notation is easily extended to the domain of real numbers or, alternatively, restricted to a subset of the natural numbers. It is important, however, to understand the precise meaning of the notation so that when it is abused, it is not *misused*. This section defines the basic asymptotic notations and also introduces some common abuses.

In Chapter 1, we found that the worst-case running time of insertion sort is *T*(*n*) = Θ(*n*^{2}). Let us define what this notation means. For a given function *g*(*n*), we denote by Θ(*g*(*n*)) the *set of functions*

Θ(*g*(*n*)) = {*f*(*n*) : there exist positive constants *c*_{1}, *c*_{2}, and *n*_{0} such that

0 ≤ *c*_{1}*g*(*n*) ≤ *f*(*n*) ≤ *c*_{2}*g*(*n*) for all *n* ≥ *n*_{0}}.

Figure 2.1(a) gives an intuitive picture of functions *f*(*n*) and *g*(*n*), where *f*(*n*) = Θ(*g*(*n*)). For all values of *n* to the right of *n*_{0}, the value of *f*(*n*) lies at or above *c*_{1}*g*(*n*) and at or below *c*_{2}*g*(*n*). In other words, for all *n* ≥ *n*_{0}, the function *f*(*n*) is equal to *g*(*n*) to within a constant factor. We say that *g*(*n*) is an *asymptotically tight bound* for *f*(*n*).
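A claim such as *f*(*n*) = Θ(*g*(*n*)) can be spot-checked numerically once candidate constants are chosen. The sketch below picks *f*(*n*) = *n*^{2}/2 − 3*n*, *g*(*n*) = *n*^{2}, and the example witnesses *c*_{1} = 1/14, *c*_{2} = 1/2, *n*_{0} = 7; checking a finite sample is of course an illustration, not a proof, which must hold for *all* *n* ≥ *n*_{0}:

```python
# Spot-check candidate witnesses c1, c2, n0 for the claim
# f(n) = n^2/2 - 3n  is in  Theta(n^2).

def f(n):
    return n * n / 2 - 3 * n

def g(n):
    return n * n

c1, c2, n0 = 1 / 14, 1 / 2, 7

# The defining inequalities 0 <= c1*g(n) <= f(n) <= c2*g(n) on a sample.
for n in range(n0, 10_000):
    assert 0 <= c1 * g(n) <= f(n) <= c2 * g(n)

print("witnesses hold on the sampled range")
```

Note that at *n* = 7 the lower bound is tight (both sides equal 3.5), which is why no smaller *n*_{0} works with *c*_{1} = 1/14.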

The definition of Θ(*g*(*n*)) requires that every member *f*(*n*) of Θ(*g*(*n*)) be *asymptotically nonnegative*, that is, that *f*(*n*) be nonnegative whenever *n* is sufficiently large.

To verify, for example, that *n*^{2}/2 − 3*n* = Θ(*n*^{2}), we must determine positive constants *c*_{1}, *c*_{2}, and *n*_{0} such that *c*_{1}*n*^{2} ≤ *n*^{2}/2 − 3*n* ≤ *c*_{2}*n*^{2} for all *n* ≥ *n*_{0}. Dividing by *n*^{2} yields *c*_{1} ≤ 1/2 − 3/*n* ≤ *c*_{2}, which is satisfied, for instance, by *c*_{1} = 1/14, *c*_{2} = 1/2, and *n*_{0} = 7.

The Θ-notation asymptotically bounds a function from above and below. When we have only an *asymptotic upper bound*, we use *O*-notation. For a given function *g*(*n*), we denote by *O*(*g*(*n*)) the set of functions

*O*(*g*(*n*)) = {*f*(*n*) : there exist positive constants *c* and *n*_{0} such that

0 ≤ *f*(*n*) ≤ *cg*(*n*) for all *n* ≥ *n*_{0}}.

Just as *O*-notation provides an asymptotic *upper* bound on a function, Ω-notation provides an *asymptotic lower bound*. For a given function *g*(*n*), we denote by Ω(*g*(*n*)) the set of functions

Ω(*g*(*n*)) = {*f*(*n*) : there exist positive constants *c* and *n*_{0} such that

0 ≤ *cg*(*n*) ≤ *f*(*n*) for all *n* ≥ *n*_{0}}.

For any two functions *f*(*n*) and *g*(*n*), *f*(*n*) = Θ(*g*(*n*)) if and only if *f*(*n*) = *O*(*g*(*n*)) and *f*(*n*) = Ω(*g*(*n*)).

Since Ω-notation describes a lower bound, when we use it to bound the best-case running time of an algorithm, by implication we also bound the running time of the algorithm on arbitrary inputs. For example, the best-case running time of insertion sort is Ω(*n*), which implies that the running time of insertion sort is Ω(*n*).

The running time of insertion sort therefore falls between Ω(*n*) and *O*(*n*^{2}), since it falls anywhere between a linear function of *n* and a quadratic function of *n*. Moreover, these bounds are asymptotically as tight as possible: for instance, the running time of insertion sort is not Ω(*n*^{2}), since insertion sort runs in Θ(*n*) time when the input is already sorted. It is not contradictory, however, to say that the *worst-case* running time of insertion sort is Ω(*n*^{2}), since there exists an input that causes the algorithm to take Ω(*n*^{2}) time. When we say that the *running time* (no modifier) of an algorithm is Ω(*g*(*n*)), we mean that *no matter what particular input of size n is chosen for each value of n*, the running time on that input is at least a constant times *g*(*n*), for sufficiently large *n*.
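These best- and worst-case behaviors can be observed empirically. The sketch below is a hypothetical step-counting instrumentation (not from the text): it counts the dominant inner-loop operations of insertion sort on an already-sorted input (linear growth) versus a reversed input (quadratic growth):

```python
# Count the dominant operations of insertion sort to contrast the
# Theta(n) best case (sorted input) with the Theta(n^2) worst case
# (reversed input).  Counts are illustrative, not a proof.

def insertion_sort_steps(a):
    a = list(a)
    steps = 0
    for j in range(1, len(a)):
        key = a[j]
        i = j - 1
        while i >= 0 and a[i] > key:   # shift larger elements right
            a[i + 1] = a[i]
            i -= 1
            steps += 1
        steps += 1                     # count the final comparison/test
        a[i + 1] = key
    return steps

n = 1000
best = insertion_sort_steps(range(n))          # sorted: about n steps
worst = insertion_sort_steps(range(n, 0, -1))  # reversed: about n^2/2 steps
print(best, worst)
```

On sorted input the count is exactly *n* − 1; on reversed input it is *n*(*n* − 1)/2 + (*n* − 1), consistent with the linear and quadratic bounds.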

Asymptotic notation can also appear within formulas, as in the recurrence

*T*(*n*) = 2*T*(*n*/2) + Θ(*n*).

In some cases, asymptotic notation appears on the left-hand side of an equation, as in

2*n*^{2} + Θ(*n*) = Θ(*n*^{2}).

A number of such relationships can be chained together, as in

2*n*^{2} + 3*n* + 1 = 2*n*^{2} + Θ(*n*)

= Θ(*n*^{2}).

The asymptotic upper bound provided by *O*-notation may or may not be asymptotically tight. The bound 2*n*^{2} = *O*(*n*^{2}) is asymptotically tight, but the bound 2*n* = *O*(*n*^{2}) is not. We use *o*-notation to denote an upper bound that is not asymptotically tight. We formally define *o*(*g*(*n*)) ("little-oh of *g* of *n*") as the set

*o*(*g*(*n*)) = {*f*(*n*) : for any positive constant *c* > 0, there exists a constant

*n*_{0} > 0 such that 0 ≤ *f*(*n*) < *cg*(*n*) for all *n* ≥ *n*_{0}}.

For example, 2*n* = *o*(*n*^{2}), but 2*n*^{2} ≠ *o*(*n*^{2}).
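Equivalently, *f*(*n*) = *o*(*g*(*n*)) means that the ratio *f*(*n*)/*g*(*n*) approaches 0 as *n* grows. A quick numerical illustration of the two examples above:

```python
# f(n) = o(g(n)) means f(n)/g(n) -> 0 as n grows.
# 2n/n^2 shrinks toward 0, so 2n = o(n^2);
# 2n^2/n^2 stays fixed at 2, so 2n^2 is not o(n^2).
for n in [10, 100, 1000, 10_000]:
    print(n, 2 * n / n**2, 2 * n**2 / n**2)
```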

By analogy, ω-notation is to Ω-notation as *o*-notation is to *O*-notation. We use ω-notation to denote a lower bound that is not asymptotically tight. One way to define it is by

*f*(*n*) ∈ ω(*g*(*n*)) if and only if *g*(*n*) ∈ *o*(*f*(*n*)).

Formally, however, we define ω(*g*(*n*)) ("little-omega of *g* of *n*") as the set

ω(*g*(*n*)) = {*f*(*n*) : for any positive constant *c* > 0, there exists a constant

*n*_{0} > 0 such that 0 ≤ *cg*(*n*) < *f*(*n*) for all *n* ≥ *n*_{0}}.

For example, *n*^{2}/2 = ω(*n*), but *n*^{2}/2 ≠ ω(*n*^{2}).

Many of the relational properties of real numbers apply to asymptotic comparisons as well. For the following, assume that *f*(*n*) and *g*(*n*) are asymptotically positive.

**Transitivity:**

*f*(*n*) = Θ(*g*(*n*)) and *g*(*n*) = Θ(*h*(*n*)) imply *f*(*n*) = Θ(*h*(*n*)),

*f*(*n*) = *O*(*g*(*n*)) and *g*(*n*) = *O*(*h*(*n*)) imply *f*(*n*) = *O*(*h*(*n*)),

*f*(*n*) = Ω(*g*(*n*)) and *g*(*n*) = Ω(*h*(*n*)) imply *f*(*n*) = Ω(*h*(*n*)),

*f*(*n*) = *o*(*g*(*n*)) and *g*(*n*) = *o*(*h*(*n*)) imply *f*(*n*) = *o*(*h*(*n*)),

*f*(*n*) = ω(*g*(*n*)) and *g*(*n*) = ω(*h*(*n*)) imply *f*(*n*) = ω(*h*(*n*)).

**Reflexivity:**

*f*(*n*) = Θ(*f*(*n*)),

*f*(*n*) = *O*(*f*(*n*)),

*f*(*n*) = Ω(*f*(*n*)).

**Symmetry:**

*f*(*n*) = Θ(*g*(*n*)) if and only if *g*(*n*) = Θ(*f*(*n*)).

**Transpose symmetry:**

*f*(*n*) = *O*(*g*(*n*)) if and only if *g*(*n*) = Ω(*f*(*n*)),

*f*(*n*) = *o*(*g*(*n*)) if and only if *g*(*n*) = ω(*f*(*n*)).

Because these properties hold for asymptotic notations, one can draw an analogy between the asymptotic comparison of two functions *f* and *g* and the comparison of two real numbers *a* and *b*:

*f*(*n*) = *O*(*g*(*n*)) is like *a* ≤ *b*,

*f*(*n*) = Ω(*g*(*n*)) is like *a* ≥ *b*,

*f*(*n*) = Θ(*g*(*n*)) is like *a* = *b*,

*f*(*n*) = *o*(*g*(*n*)) is like *a* < *b*,

*f*(*n*) = ω(*g*(*n*)) is like *a* > *b*.

One property of real numbers, however, does not carry over to asymptotic notation: *trichotomy*. For any two real numbers *a* and *b*, exactly one of *a* < *b*, *a* = *b*, or *a* > *b* must hold. Although any two real numbers can be compared, not all functions are asymptotically comparable; it may be that neither *f*(*n*) = *O*(*g*(*n*)) nor *f*(*n*) = Ω(*g*(*n*)) holds. For example, the functions *n* and *n*^{1 + sin *n*} cannot be compared using asymptotic notation, since the value of the exponent 1 + sin *n* oscillates between 0 and 2.

Show that for any real constants *a* and *b*, where *b* > 0,

(*n* + *a*)^{b} = Θ(*n*^{b}).

Explain why the statement, "The running time of algorithm *A* is at least *O*(*n*^{2})," is content-free.

Is 2^{*n*+1} = *O*(2^{*n*})? Is 2^{2*n*} = *O*(2^{*n*})?

Prove that *o*(*g*(*n*)) ∩ ω(*g*(*n*)) is the empty set.

*O*(*g*(*n*, *m*)) = {*f*(*n*, *m*) : there exist positive constants *c*, *n*_{0}, and *m*_{0}

such that 0 ≤ *f*(*n*, *m*) ≤ *cg*(*n*, *m*) for all *n* ≥ *n*_{0} and *m* ≥ *m*_{0}}.

Give corresponding definitions for Ω(*g*(*n*, *m*)) and Θ(*g*(*n*, *m*)).

A function *f*(*n*) is *monotonically increasing* if *m* ≤ *n* implies *f*(*m*) ≤ *f*(*n*). Similarly, it is *monotonically decreasing* if *m* ≤ *n* implies *f*(*m*) ≥ *f*(*n*). A function *f*(*n*) is *strictly increasing* if *m* < *n* implies *f*(*m*) < *f*(*n*) and *strictly decreasing* if *m* < *n* implies *f*(*m*) > *f*(*n*).

For any real number *x*, we denote the greatest integer less than or equal to *x* by ⌊*x*⌋ (read "the floor of *x*") and the least integer greater than or equal to *x* by ⌈*x*⌉ (read "the ceiling of *x*"). For all real *x*,

*x* − 1 < ⌊*x*⌋ ≤ *x* ≤ ⌈*x*⌉ < *x* + 1.

For any integer *n*,

⌈*n*/2⌉ + ⌊*n*/2⌋ = *n*,

and for any integer *n* and integers *a* > 0 and *b* > 0,

⌈⌈*n*/*a*⌉/*b*⌉ = ⌈*n*/(*ab*)⌉

and

⌊⌊*n*/*a*⌋/*b*⌋ = ⌊*n*/(*ab*)⌋.
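These nested identities can be checked exactly with integer arithmetic. In the sketch below, Python's `//` operator gives floor division, and ceiling division is expressed as `-(-n // a)` to avoid floating-point error:

```python
# Check the nested floor/ceiling identities with exact integer
# arithmetic for positive divisors a and b.

def floordiv(n, a):
    return n // a            # floor(n / a) for a > 0

def ceildiv(n, a):
    return -(-n // a)        # ceil(n / a) for a > 0

for n in range(500):
    for a in (1, 2, 3, 7):
        for b in (1, 2, 5):
            assert ceildiv(ceildiv(n, a), b) == ceildiv(n, a * b)
            assert floordiv(floordiv(n, a), b) == floordiv(n, a * b)

print("identities hold on the sampled range")
```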

The floor and ceiling functions are monotonically increasing.

Given a nonnegative integer *d*, a **polynomial in *n* of degree** *d* is a function *p*(*n*) of the form

*p*(*n*) = Σ_{*i*=0}^{*d*} *a*_{*i*}*n*^{*i*},

where the constants *a*_{0}, *a*_{1}, . . . , *a*_{*d*} are the *coefficients* of the polynomial and *a*_{*d*} ≠ 0.

For all real *a* > 0, *m*, and *n*, we have the following identities:

*a*^{0} = 1,

*a*^{1} = *a*,

*a*^{−1} = 1/*a*,

(*a*^{m})^{n} = *a*^{mn},

(*a*^{m})^{n} = (*a*^{n})^{m},

*a*^{m}*a*^{n} = *a*^{m+n}.

For all real constants *a* and *b* such that *a* > 1,

lim_{*n*→∞} *n*^{b}/*a*^{n} = 0,

from which we can conclude that

*n*^{b} = *o*(*a*^{n}).
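The limit can be watched numerically. The sketch below (sample values only) tracks *n*^{b}/*a*^{n} for the deliberately hard-looking case *b* = 10 and *a* = 1.1; the ratio grows at first, but the exponential eventually overwhelms the polynomial:

```python
# n^b / a^n -> 0 for any a > 1, even when b is large and a is barely
# above 1.  The ratio may rise for small n before collapsing.
a, b = 1.1, 10
for n in [10, 100, 1000, 2000]:
    print(n, n**b / a**n)
```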

Thus, any positive exponential function grows faster than any polynomial.

Using *e* to denote 2.71828 . . . , the base of the natural logarithm function, we have for all real *x*,

*e*^{x} ≥ 1 + *x*,

where equality holds only when *x* = 0. When |*x*| ≤ 1, we have the approximation

1 + *x* ≤ *e*^{x} ≤ 1 + *x* + *x*^{2}.

When *x* → 0, the approximation of *e*^{x} by 1 + *x* is quite good:

*e*^{x} = 1 + *x* + Θ(*x*^{2}).
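A quick numeric check of these inequalities at a few sample points (illustrative only):

```python
# Check e^x >= 1 + x for assorted x, and the two-sided bound
# 1 + x <= e^x <= 1 + x + x^2 when |x| <= 1.
import math

for x in [-0.9, -0.5, 0.0, 0.3, 1.0]:
    assert math.exp(x) >= 1 + x
    if abs(x) <= 1:
        assert 1 + x <= math.exp(x) <= 1 + x + x * x

print("bounds hold at the sampled points")
```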

We shall use the following notations:

lg *n* = log_{2} *n* (binary logarithm),

ln *n* = log_{e} *n* (natural logarithm),

lg^{k} *n* = (lg *n*)^{k} (exponentiation),

lg lg *n* = lg(lg *n*) (composition).

For all real *a* > 0, *b* > 0, *c* > 0, and *n*,

*a* = *b*^{log_{b} *a*},

log_{c}(*ab*) = log_{c} *a* + log_{c} *b*,

log_{b} *a*^{n} = *n* log_{b} *a*,

log_{b} *a* = log_{c} *a* / log_{c} *b*,

log_{b}(1/*a*) = −log_{b} *a*,

log_{b} *a* = 1/log_{a} *b*,

*a*^{log_{b} *c*} = *c*^{log_{b} *a*},

where, in each equation above, logarithm bases are not 1.

There is a simple series expansion for ln(1 + *x*) when |*x*| < 1:

ln(1 + *x*) = *x* − *x*^{2}/2 + *x*^{3}/3 − *x*^{4}/4 + *x*^{5}/5 − ⋯.

We also have the following inequalities for *x* > −1:

*x*/(1 + *x*) ≤ ln(1 + *x*) ≤ *x*,

where equality holds only for *x* = 0.

We say that a function *f*(*n*) is **polylogarithmically bounded** if *f*(*n*) = lg^{O(1)} *n*. We can relate the growth of polynomials and polylogarithms by substituting lg *n* for *n* and 2^{a} for *a* in the limit above, yielding

lim_{*n*→∞} lg^{b} *n* / (2^{a})^{lg *n*} = lim_{*n*→∞} lg^{b} *n* / *n*^{a} = 0.

From this limit, we can conclude that

lg^{b} *n* = *o*(*n*^{a})

for any constant *a* > 0. Thus, any positive polynomial function grows faster than any polylogarithmic function.
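Numerically the decay is slow but real. The sketch below samples the ratio lg^{b} *n* / *n*^{a} for the adversarial-looking choice *b* = 5, *a* = 1/2 (sample values only); the ratio is flat at first, then collapses:

```python
# lg^b(n) / n^a -> 0 for any constant a > 0, no matter how large b is.
import math

b, a = 5, 0.5
for n in [2**10, 2**20, 2**40, 2**80]:
    print(n, math.log2(n)**b / n**a)
```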

The notation *n*! (read "*n* factorial") is defined for integers *n* ≥ 0 as

*n*! = 1 if *n* = 0, and *n*! = *n* · (*n* − 1)! if *n* > 0.

A weak upper bound on the factorial function is *n*! ≤ *n*^{n}, since each of the *n* terms in the factorial product is at most *n*.

*n*! = *o*(*n*^{n}),

*n*! = ω(2^{n}),

lg(*n*!) = Θ(*n* lg *n*).
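The last bound can be probed numerically: lg(*n*!)/(*n* lg *n*) creeps toward 1, roughly as 1 − 1/ln *n*. The sketch below uses `math.lgamma` (ln(*n*!) = lgamma(*n* + 1)) so that no huge factorial is ever materialized:

```python
# The ratio lg(n!) / (n lg n) settles toward 1, consistent with
# lg(n!) = Theta(n lg n).  math.lgamma(n + 1) computes ln(n!) stably.
import math

def lg_factorial(n):
    return math.lgamma(n + 1) / math.log(2)   # log2(n!)

for n in [10, 100, 10_000, 1_000_000]:
    print(n, lg_factorial(n) / (n * math.log2(n)))
```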

The following bounds also hold for all *n*:

We use the notation lg* *n* (read "log star of *n*") to denote the iterated logarithm, which is defined as follows. Let the function lg^{(i)} *n* be defined recursively for nonnegative integers *i* as lg^{(i)} *n* = *n* if *i* = 0; lg(lg^{(i−1)} *n*) if *i* > 0 and lg^{(i−1)} *n* > 0; and undefined otherwise. Then

lg* *n* = min {*i* ≥ 0 : lg^{(i)} *n* ≤ 1}.

The iterated logarithm is a *very* slowly growing function:

lg* 2 = 1,

lg* 4 = 2,

lg* 16 = 3,

lg* 65536 = 4,

lg*(2^{65536}) = 5.
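A direct implementation (a sketch that reads lg* *n* as "the number of times lg must be applied before the value drops to at most 1") reproduces the table above:

```python
# Iterated logarithm: lg*(n) = least i >= 0 with lg^(i)(n) <= 1.
import math

def lg_star(n):
    i = 0
    while n > 1:
        n = math.log2(n)   # works even for the huge integer 2**65536
        i += 1
    return i

print([lg_star(n) for n in (2, 4, 16, 65536)], lg_star(2**65536))
# -> [1, 2, 3, 4] 5
```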

The *Fibonacci numbers* are defined by the following recurrence:

*F*_{0} = 0,

*F*_{1} = 1,

*F*_{i} = *F*_{i−1} + *F*_{i−2} for *i* ≥ 2.

Thus, each Fibonacci number is the sum of the two previous ones, yielding the sequence

0, 1, 1, 2, 3, 5, 8, 13, 21, 34, 55, ... .
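The recurrence translates directly into code; a minimal sketch generating the sequence above:

```python
def fibonacci(count):
    """First `count` Fibonacci numbers, straight from the recurrence."""
    fibs = [0, 1]                          # F_0 and F_1
    while len(fibs) < count:
        fibs.append(fibs[-1] + fibs[-2])   # F_i = F_{i-1} + F_{i-2}
    return fibs[:count]

print(fibonacci(11))   # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34, 55]
```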

Fibonacci numbers are related to the *golden ratio* φ and to its conjugate φ̂, which are given by the following formulas:

φ = (1 + √5)/2 = 1.61803 . . . ,

φ̂ = (1 − √5)/2 = −.61803 . . . .

Prove that lg(*n*!) = Θ(*n* lg *n*) and that *n*! = *o*(*n*^{n}).

Is the function ⌈lg *n*⌉! polynomially bounded? Is the function ⌈lg lg *n*⌉! polynomially bounded?

Which is asymptotically larger: lg(lg* *n*) or lg*(lg *n*)?

Prove that for *i* ≥ 0, the (*i* + 2)nd Fibonacci number satisfies *F*_{i+2} ≥ φ^{i}.

2-1 Asymptotic behavior of polynomials

Let

*p*(*n*) = Σ_{*i*=0}^{*d*} *a*_{*i*}*n*^{*i*},

where *a*_{d} > 0, be a degree-*d* polynomial in *n*, and let *k* be a constant. Use the definitions of the asymptotic notations to prove the following properties.

**a.** If *k* ≥ *d*, then *p*(*n*) = *O*(*n*^{k}).

**b.** If *k* ≤ *d*, then *p*(*n*) = Ω(*n*^{k}).

**c.** If *k* = *d*, then *p*(*n*) = Θ(*n*^{k}).

**d.** If *k* > *d*, then *p*(*n*) = *o*(*n*^{k}).

**e.** If *k* < *d*, then *p*(*n*) = ω(*n*^{k}).

2-2 Relative asymptotic growths

2-3 Ordering by asymptotic growth rates

2-4 Asymptotic notation properties

Let *f*(*n*) and *g*(*n*) be asymptotically positive functions. Prove or disprove each of the following conjectures.

**a.** *f*(*n*) = *O*(*g*(*n*)) implies *g*(*n*) = *O*(*f*(*n*)).

**b.** *f*(*n*) + *g*(*n*) = Θ(min(*f*(*n*), *g*(*n*))).

**d.** *f*(*n*) = *O*(*g*(*n*)) implies 2^{*f*(*n*)} = *O*(2^{*g*(*n*)}).

**f.** *f*(*n*) = *O*(*g*(*n*)) implies *g*(*n*) = Ω(*f*(*n*)).

Some authors define Ω in a slightly different way than we do; let's use Ω^{∞} (read "omega infinity") for this alternative definition. We say that *f*(*n*) = Ω^{∞}(*g*(*n*)) if there exists a positive constant *c* such that *f*(*n*) ≥ *cg*(*n*) ≥ 0 for infinitely many integers *n*.

Some authors also define *O* in a slightly different manner; let's use *O*′ for the alternative definition. We say that *f*(*n*) = *O*′(*g*(*n*)) if and only if |*f*(*n*)| = *O*(*g*(*n*)).

**c.** What happens to each direction of the "if and only if" in Theorem 2.1 under this new definition?

Some authors define Õ (read "soft-oh") to mean *O* with logarithmic factors ignored:

Õ(*g*(*n*)) = {*f*(*n*) : there exist positive constants *c*, *k*, and *n*_{0} such that

0 ≤ *f*(*n*) ≤ *cg*(*n*) lg^{k}(*n*) for all *n* ≥ *n*_{0}}.

**d.** Define Ω̃ and Θ̃ in a similar manner. Prove the corresponding analog to Theorem 2.1.

The iteration operator "*" used in the lg* function can be applied to monotonically increasing functions over the reals. For a function *f* satisfying *f*(*n*) < *n*, we define the function *f*^{(i)} recursively for nonnegative integers *i* by

*f*^{(i)}(*n*) = *f*(*f*^{(i−1)}(*n*)) if *i* > 0, and *f*^{(i)}(*n*) = *n* if *i* = 0.

For a given constant *c* ∈ **R**, we define the iterated function f*_{c} by

f*_{c}(*n*) = min {*i* ≥ 0 : *f*^{(i)}(*n*) ≤ *c*}.

For each of the following functions *f*(*n*) and constants *c*, give as tight a bound as possible on f*_{c}(*n*).