cost function, then any lower bound for g(n) is also a lower bound for your cost
function. Likewise for Θ notation.
The significance of rule (2) is that you can ignore any multiplicative constants
in your equations when using big-Oh notation. This rule also holds true for Ω and Θ
notations.
Rule (3) says that given two parts of a program run in sequence (whether two
statements or two sections of code), you need consider only the more expensive
part. This rule applies to Ω and Θ notations as well: For both, you need consider
only the more expensive part.
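The sequencing rule can be sketched with a small Java example. The class name `SequenceRule` and its operation counter are invented for this illustration, not taken from the text; the point is that a Θ(n) pass followed by a Θ(n²) pass costs Θ(n + n²) = Θ(n²).

```java
// Two code sections run in sequence: a linear pass followed by a
// nested (quadratic) loop. Asymptotically, only the more expensive
// quadratic part matters: n + n^2 is Theta(n^2).
public class SequenceRule {
    public static long work(int n) {
        long ops = 0;
        for (int i = 0; i < n; i++) {     // first part: Theta(n)
            ops++;
        }
        for (int i = 0; i < n; i++) {     // second part: Theta(n^2)
            for (int j = 0; j < n; j++) {
                ops++;
            }
        }
        return ops;                       // n + n^2 operations total
    }
}
```

For n = 10 this counts 10 + 100 = 110 operations; as n grows, the linear part's share of the total shrinks toward zero.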
Rule (4) is used to analyze simple loops in programs. If some action is repeated
some number of times, and each repetition has the same cost, then the total cost is
the cost of the action multiplied by the number of times that the action takes place.
This rule applies to Ω and Θ notations as well.
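Rule (4) can be illustrated with a minimal sketch (the class `LoopRule` and the per-iteration cost of 3 "basic operations" are assumptions made for this example): an action of constant cost c repeated n times costs c · n, which is Θ(n).

```java
// A simple loop: a constant-cost action repeated n times.
// If each iteration costs c basic operations, the total is c*n,
// which is Theta(n).
public class LoopRule {
    public static long countOps(int n) {
        long ops = 0;
        for (int i = 0; i < n; i++) {
            ops += 3;   // pretend the loop body costs 3 basic operations
        }
        return ops;     // 3n operations in total
    }
}
```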
Taking the first three rules collectively, you can ignore all constants and all
lower-order terms to determine the asymptotic growth rate for any cost function.
The advantages and dangers of ignoring constants were discussed near the
beginning of this section. Ignoring lower-order terms is reasonable when performing an
asymptotic analysis. The higher-order terms soon swamp the lower-order terms in
their contribution to the total cost as n becomes larger. Thus, if T(n) = 3n^4 + 5n^2,
then T(n) is in O(n^4). The n^2 term contributes relatively little to the total cost for
large n.
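This swamping effect is easy to check numerically. The sketch below (class `LowerOrder` is an invention for this illustration) computes what fraction of T(n) = 3n^4 + 5n^2 comes from the lower-order 5n^2 term:

```java
// For T(n) = 3n^4 + 5n^2, compute the share of the total cost that
// the lower-order 5n^2 term contributes. This share shrinks rapidly
// as n grows, which is why the term can be ignored asymptotically.
public class LowerOrder {
    public static double lowOrderShare(long n) {
        double high = 3.0 * n * n * n * n;   // 3n^4
        double low  = 5.0 * n * n;           // 5n^2
        return low / (high + low);
    }
}
```

At n = 10 the 5n^2 term supplies under 2% of the total; at n = 1000 its share is on the order of one millionth.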
Throughout the rest of this topic, these simplifying rules are used when
discussing the cost for a program or algorithm.
3.4.5 Classifying Functions
Given functions f(n) and g(n) whose growth rates are expressed as algebraic
equations, we might like to determine if one grows faster than the other. The best way
to do this is to take the limit of the two functions as n grows towards infinity:

lim_{n→∞} f(n)/g(n).
If the limit goes to ∞, then f(n) is in Ω(g(n)) because f(n) grows faster. If the
limit goes to zero, then f(n) is in O(g(n)) because g(n) grows faster. If the limit
goes to some constant other than zero, then f(n) = Θ(g(n)) because both grow at
the same rate.
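Such limits can also be probed numerically by evaluating the ratio for ever larger n. The sketch below (class `LimitProbe` and the chosen pair of functions are illustrative assumptions) compares f(n) = n^2 + n against g(n) = n^2; the ratio settles near the constant 1, so f(n) = Θ(g(n)):

```java
// Probe lim f(n)/g(n) numerically for f(n) = n^2 + n and g(n) = n^2.
// The ratio approaches the constant 1, indicating the two functions
// grow at the same rate, i.e. n^2 + n is Theta(n^2).
public class LimitProbe {
    public static double ratio(double n) {
        double f = n * n + n;   // f(n) = n^2 + n
        double g = n * n;       // g(n) = n^2
        return f / g;           // tends to 1 as n grows
    }
}
```

A numeric check like this only suggests the trend; the formal classification still comes from evaluating the limit itself, as in the example that follows.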
Example 3.8 If f(n) = 2n log n and g(n) = n^2, is f(n) in O(g(n)),
Ω(g(n)), or Θ(g(n))? Because

n^2 / (2n log n) = n / (2 log n),