If someone asked you out of the blue “Who is the best?” your natural reaction
should be to reply “Best at what?” In the same way, if you are asked “What is
the growth rate of this algorithm?" you would need to ask "When? Best case?
Average case? Or worst case?” Some algorithms have the same behavior no matter
which input instance they receive. An example is finding the maximum in an array
of integers. But for many algorithms, it makes a big difference, such as when
searching an unsorted array for a particular value. So any statement about the
upper bound of an algorithm must be in the context of some class of inputs of size
n. We measure this upper bound nearly always on the best-case, average-case, or
worst-case inputs. Thus, we cannot say, "this algorithm has an upper bound to
its growth rate of n²." We must say something like, "this algorithm has an upper
bound to its growth rate of n² in the average case."
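The searching example above can be made concrete. The sketch below (class and method names are ours, not the text's) shows sequential search, whose cost depends heavily on which input instance it receives:

```java
// Sequential search: return the index of K in A, or -1 if K is absent.
// Best case: K sits in A[0], so one comparison suffices.
// Worst case: K is absent or in the last slot, requiring n comparisons,
//   so the worst case is in O(n).
// Average case: if K is present and equally likely in each position,
//   about n/2 comparisons are needed -- still in O(n).
public class SearchDemo {
  public static int sequentialSearch(int[] A, int K) {
    for (int i = 0; i < A.length; i++)
      if (A[i] == K)
        return i;       // found: i + 1 comparisons were made
    return -1;          // not found: all n elements were examined
  }
}
```

Contrast this with finding the maximum of an array, which examines all n elements on every input and so behaves the same in every case.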
Knowing that something is in O(f(n)) says only how bad things can be. Per-
haps things are not nearly so bad. Because sequential search is in O(n) in the worst
case, it is also true to say that sequential search is in O(n²). But sequential search
is practical for large n, in a way that is not true for some other algorithms in O(n²).
We always seek to define the running time of an algorithm with the tightest (low-
est) possible upper bound. Thus, we prefer to say that sequential search is in O(n).
This also explains why the phrase "is in O(f(n))" or the notation "∈ O(f(n))" is
used instead of "is O(f(n))" or "= O(f(n))." There is no strict equality in the use
of big-Oh notation. O(n) is in O(n²), but O(n²) is not in O(n).
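The containment just stated follows directly from the definition of big-Oh. A short derivation (our own, added for illustration):

```latex
% Suppose T(n) \in O(n): there exist constants c and n_0 with
% T(n) \le c\,n for all n > n_0. Since n \le n^2 whenever n \ge 1,
% the same constant c witnesses membership in O(n^2):
T(n) \;\le\; c\,n \;\le\; c\,n^2 \qquad \text{for all } n > \max(n_0, 1).
% The reverse containment fails: no fixed c makes n^2 \le c\,n
% hold for all sufficiently large n.
```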
3.4.2 Lower Bounds
Big-Oh notation describes an upper bound. In other words, big-Oh notation states
a claim about the greatest amount of some resource (usually time) that is required
by an algorithm for some class of inputs of size n (typically the worst such input,
the average of all possible inputs, or the best such input).
Similar notation is used to describe the least amount of a resource that an alg-
orithm needs for some class of input. Like big-Oh notation, this is a measure of the
algorithm's growth rate. Like big-Oh notation, it works for any resource, but we
most often measure the least amount of time required. And again, like big-Oh no-
tation, we are measuring the resource required for some particular class of inputs:
the worst-, average-, or best-case input of size n.
The lower bound for an algorithm (or a problem, as explained later) is denoted
by the symbol Ω, pronounced "big-Omega" or just "Omega." The following defi-
nition for Ω is symmetric with the definition of big-Oh.
For T(n) a non-negatively valued function, T(n) is in set Ω(g(n))
if there exist two positive constants c and n₀ such that T(n) ≥ cg(n)
for all n > n₀.¹
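To see the definition at work, here is a short worked example (the function is illustrative, chosen by us):

```latex
% Take T(n) = c_1 n^2 + c_2 n, with c_1, c_2 > 0.
% For all n > 1 the term c_2 n is positive, so
T(n) \;=\; c_1 n^2 + c_2 n \;\ge\; c_1 n^2 \qquad \text{for all } n > 1.
% Choosing c = c_1 and n_0 = 1 satisfies the definition,
% hence T(n) \in \Omega(n^2).
```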
¹ An alternate (non-equivalent) definition for Ω is