Orders of Growth

The previous examples illustrate that processes can differ
considerably in the rates at which they consume computational
resources. One convenient way to describe this difference is to use
the notion of
*order of growth* to obtain a gross measure of the
resources required by a process as the inputs become larger.

Let *n* be a parameter that measures the size of the problem, and let
*R*(*n*) be the amount of resources the process requires for a problem
of size *n*. In our previous examples we took *n* to be the number
for which a given function is to be computed, but there are other
possibilities. For instance, if our goal is to compute an
approximation to the square root of a number, we might take *n* to be
the number of digits accuracy required. For matrix multiplication we
might take *n* to be the number of rows in the matrices. In general
there are a number of properties of the problem with respect to which
it will be desirable to analyze a given process. Similarly, *R*(*n*)
might measure the number of internal storage registers used, the
number of elementary machine operations performed, and so on. In
computers that do only a fixed number of operations at a time, the
time required will be proportional to the number of elementary machine
operations performed.

We say that *R*(*n*) has order of growth Θ(*f*(*n*)),
written *R*(*n*) = Θ(*f*(*n*))
(pronounced ``theta of *f*(*n*)''), if there are
positive constants *k*_{1} and *k*_{2} independent of *n* such that

*k*_{1}*f*(*n*) ≤ *R*(*n*) ≤ *k*_{2}*f*(*n*)

for any sufficiently large value of *n*. (In other words, for large
*n*, the value *R*(*n*) is sandwiched between *k*_{1}*f*(*n*) and
*k*_{2}*f*(*n*).)
For instance, with the linear recursive process for computing
factorial described in section 1.2.1 the
number of steps grows proportionally to the input *n*. Thus, the
number of steps required for this process grows as Θ(*n*).
We also saw
that the space required grows as Θ(*n*).
For the
iterative
factorial, the number of steps is still Θ(*n*)
but the space is Θ(1)--that is, constant. The
tree-recursive Fibonacci computation requires
Θ(φ^{*n*}) steps and space Θ(*n*),
where φ = (1 + √5)/2 is the
golden ratio described in section 1.2.2.
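For reference, here are the two factorial procedures from section 1.2.1 (reproduced together; the wrapper name `factorial-iter` is ours, to keep both definitions in one sketch). The recursive version defers a multiplication at each call, so its space grows with *n*; the iterative version carries its complete state in its arguments, so its space is constant:

```scheme
;; Linear recursive factorial: theta(n) steps and theta(n) space,
;; because deferred multiplications pile up until n reaches 1.
(define (factorial n)
  (if (= n 1)
      1
      (* n (factorial (- n 1)))))

;; Iterative factorial: still theta(n) steps, but theta(1) space,
;; since the complete state is held in the three arguments.
(define (fact-iter product counter max-count)
  (if (> counter max-count)
      product
      (fact-iter (* counter product)
                 (+ counter 1)
                 max-count)))

(define (factorial-iter n) (fact-iter 1 1 n))
```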

Orders of growth provide only a crude description of the behavior of a
process. For example, a process requiring *n*^{2} steps, a process
requiring 1000*n*^{2} steps, and a process requiring
3*n*^{2}+10*n*+17 steps
all have Θ(*n*^{2})
order of growth. On the other hand, order of
growth provides a useful indication of how we may expect the behavior
of the process to change as we change the size of the problem. For a
Θ(*n*) (linear) process, doubling the size will roughly double the amount
of resources used. For an
exponential process, each increment in
problem size will multiply the resource utilization by a constant
factor. In the remainder of section 1.2
we will examine two
algorithms whose order of growth is
logarithmic, so that doubling the
problem size increases the resource requirement by a constant amount.
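To preview why logarithmic growth behaves this way (our sketch, not code from the text): a process that halves the problem at each step takes a number of steps proportional to the number of halvings, so doubling *n* adds only one more step.

```scheme
;; Count how many halvings reduce n to 1; this count grows as
;; theta(log n), since doubling n adds exactly one more halving.
(define (halvings n)
  (if (<= n 1)
      0
      (+ 1 (halvings (quotient n 2)))))
```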

**Exercise.**
Draw the tree illustrating the process generated by the
`count-change` procedure of section 1.2.2 in making
change for 11 cents. What are the orders of growth of the space and
number of steps used by this process as the amount to be changed
increases?

**Exercise.**
The sine of an angle (specified in
radians) can be computed by making use of the approximation
sin *x* ≈ *x* if *x* is
sufficiently small, and the trigonometric identity

sin *x* = 3 sin (*x*/3) − 4 sin^{3}(*x*/3)

to reduce the size of the argument of sin. (For purposes of this
exercise an angle is considered ``sufficiently small'' if its magnitude
is not greater than 0.1 radians.) These ideas are incorporated in the
following procedures:

```scheme
(define (cube x) (* x x x))

(define (p x) (- (* 3 x) (* 4 (cube x))))

(define (sine angle)
   (if (not (> (abs angle) 0.1))
       angle
       (p (sine (/ angle 3.0)))))
```

a.  How many times is the procedure `p`
applied when `(sine 12.15)` is evaluated?

b.  What is the order of growth in space and number of steps (as a
function of *a*) used by the process generated by the `sine`
procedure when `(sine a)` is evaluated?