Example 3.7 Assume T(n) = c₁n² + c₂n for c₁ and c₂ > 0. Then,

c₁n² + c₂n ≥ c₁n²

for all n > 1. So, T(n) ≥ cn² for c = c₁ and n₀ = 1. Therefore, T(n) is
in Ω(n²) by the definition.
It is also true that the equation of Example 3.7 is in Ω(n). However, as with
big-Oh notation, we wish to get the "tightest" (for Ω notation, the largest) bound
possible. Thus, we prefer to say that this running time is in Ω(n²).
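The inequality in Example 3.7 can be checked numerically. The sketch below uses illustrative constants c1 = 3 and c2 = 7 (chosen arbitrarily; they are not from the text) and confirms that T(n) ≥ c1·n² holds for a range of n, which is exactly the witness pair c = c1, n0 = 1 used in the example.

```java
// Numeric spot-check of Example 3.7: T(n) = c1*n^2 + c2*n >= c1*n^2.
// The constants c1 = 3 and c2 = 7 are arbitrary illustrative choices.
public class LowerBoundCheck {
    // T(n) for the given positive constants c1 and c2.
    static long t(long n, long c1, long c2) {
        return c1 * n * n + c2 * n;
    }

    public static void main(String[] args) {
        long c1 = 3, c2 = 7;
        for (long n = 1; n <= 1000; n++) {
            // Since c2*n >= 0 for n >= 1, dropping that term can only
            // shrink the sum, so T(n) >= c1*n^2 must hold.
            if (t(n, c1, c2) < c1 * n * n) {
                throw new AssertionError("bound fails at n = " + n);
            }
        }
        System.out.println("T(n) >= c1*n^2 holds for n = 1..1000");
    }
}
```

The check does not prove the bound, of course; the proof is the one-line algebraic argument above. The code merely makes the witness constants concrete.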
Recall the sequential search algorithm to find a value K within an array of
integers. In the average and worst cases this algorithm is in Ω(n), because in both
the average and worst cases we must examine at least cn values (where c is 1/2 in
the average case and 1 in the worst case).
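For reference, here is a minimal sketch of the sequential search just described. The class and method names are ours, not from the text. In the worst case (K absent or in the last position) all n elements are examined, giving c = 1; on average, when K is present at a uniformly random position, about n/2 elements are examined, giving c = 1/2.

```java
// Sequential search over an int array: return the index of k, or -1
// if k is not present. Worst case examines all n values (c = 1);
// the average successful search examines about n/2 (c = 1/2).
public class SequentialSearch {
    static int find(int[] a, int k) {
        for (int i = 0; i < a.length; i++) {
            if (a[i] == k) return i;  // found after i+1 comparisons
        }
        return -1;                    // examined all n values
    }

    public static void main(String[] args) {
        int[] a = {42, 7, 19, 3, 88};
        System.out.println(find(a, 19));  // prints 2
        System.out.println(find(a, 5));   // prints -1
    }
}
```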
3.4.3 Θ Notation
The definitions for big-Oh and Ω give us ways to describe the upper bound for an
algorithm (if we can find an equation for the maximum cost of a particular class of
inputs of size n) and the lower bound for an algorithm (if we can find an equation
for the minimum cost for a particular class of inputs of size n). When the upper
and lower bounds are the same within a constant factor, we indicate this by using
Θ (big-Theta) notation. An algorithm is said to be Θ(h(n)) if it is in O(h(n)) and
it is in Ω(h(n)).
An alternative (non-equivalent) definition for Ω is: T(n) is in the set Ω(g(n)) if
there exists a positive constant c such that T(n) ≥ cg(n) for an infinite number of
values for n.
This definition says that for an "interesting" number of cases, the algorithm takes at least cg(n)
time. Note that this definition is not symmetric with the definition of big-Oh. For g(n) to be a lower
bound, this definition does not require that T(n) ≥ cg(n) for all values of n greater than some
constant. It only requires that this happen often enough, in particular that it happen for an infinite
number of values for n. Motivation for this alternate definition can be found in the following example.
Assume a particular algorithm has the following behavior:

T(n) = n       for all odd n ≥ 1
T(n) = n²/100  for all even n ≥ 0

From this definition, n²/100 ≥ (1/100)n² for all even n ≥ 0. So, T(n) ≥ cn² for an infinite number
of values of n (i.e., for all even n) for c = 1/100. Therefore, T(n) is in Ω(n²) by the definition.
For this equation for T(n), it is true that all inputs of size n take at least cn time. But an infinite
number of inputs of size n take cn² time, so we would like to say that the algorithm is in Ω(n²).
Unfortunately, using our first definition will yield a lower bound of Ω(n) because it is not possible to
pick constants c and n₀ such that T(n) ≥ cn² for all n > n₀. The alternative definition does result
in a lower bound of Ω(n²) for this algorithm, which seems to fit common sense more closely. Fortu-
nately, few real algorithms or computer programs display the pathological behavior of this example.
Our first definition for Ω generally yields the expected result.
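The pathological piecewise T(n) above is easy to make concrete. The sketch below (class and method names are ours) encodes the even/odd behavior and verifies that T(n) ≥ (1/100)n² holds for every even n in a sample range, i.e., on the infinite set of witnesses the alternate Ω definition relies on, while odd inputs fall far below that curve.

```java
// The piecewise running time from the example:
//   T(n) = n        for odd n,
//   T(n) = n^2/100  for even n.
// The alternate Omega definition needs T(n) >= c*n^2 for infinitely
// many n; the even values of n provide those witnesses with c = 1/100.
public class PathologicalT {
    static double t(long n) {
        return (n % 2 == 1) ? (double) n : (double) n * n / 100.0;
    }

    public static void main(String[] args) {
        for (long n = 0; n <= 10000; n += 2) {  // even n only
            if (t(n) < (double) n * n / 100.0) {
                throw new AssertionError("bound fails at n = " + n);
            }
        }
        System.out.println("T(n) >= n^2/100 for all even n in [0, 10000]");
    }
}
```

By contrast, no constants c and n₀ make T(n) ≥ cn² hold for all n > n₀, since every odd n breaks the bound once n is large enough; this is why the first definition yields only Ω(n) here.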
As you can see from this discussion, asymptotic bounds notation is not a law of nature. It is merely
a powerful modeling tool used to describe the behavior of algorithms.