time. The algorithm can do no better than its best-case time. If the best-case time is still too slow, you need another algorithm.
Now suppose that the algorithm locates the desired value after comparing it to every value in the array. This would be the algorithm's worst case, since it requires the most time. If you can tolerate this worst-case time, your algorithm is acceptable. For many algorithms, the worst and best cases rarely occur. Thus, we consider an algorithm's average case, when it processes a typical data set. The average-case time requirement of an algorithm is more useful, but harder to estimate. Note that the average-case time is not the average of the best-case and worst-case times.
Note: The time requirements of some algorithms depend on the data values given to them. Those times range from a minimum, or best-case, time to a maximum, or worst-case, time. Typically, the best and worst cases do not occur. A more useful measure of such an algorithm's time requirement is its average-case time.
Some algorithms, however, do not have a best, worst, and average case. Their time requirements depend only on the number of data items given them, not on the values of that data.
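To make these cases concrete, consider a sequential search of an array. The following sketch is illustrative rather than taken from the book's figures; the class and method names are our own.

public class SearchCases {
    /** Returns the index of target in a, or -1 if target is absent. */
    public static int indexOf(int[] a, int target) {
        for (int i = 0; i < a.length; i++) {
            if (a[i] == target) {
                return i; // found after i + 1 comparisons
            }
        }
        return -1; // absent: all a.length values were compared
    }

    public static void main(String[] args) {
        int[] data = {8, 3, 9, 5, 1};
        System.out.println(indexOf(data, 8)); // best case: 1 comparison
        System.out.println(indexOf(data, 1)); // worst case: 5 comparisons
        System.out.println(indexOf(data, 7)); // worst case again: value absent
    }
}

A successful search examines about half of the array on average, so the average-case time is roughly n/2 comparisons, which is still proportional to n.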
Big Oh Notation
4.11
Computer scientists use a notation to represent an algorithm's complexity. For example, consider the algorithms A, B, and C given in Figure 4-1 and the number of basic operations that each requires, as shown in Figure 4-2. Instead of saying that Algorithm A has a time requirement proportional to n, we say that A is O(n). We call this notation Big Oh since it uses the capital letter O. We read O(n) as either “Big Oh of n” or “order of at most n.” Similarly, since Algorithm B has a time requirement proportional to n², we say that B is O(n²). Algorithm C always requires three basic operations. Regardless of the problem size n, this algorithm requires the same time. We say that Algorithm C is O(1).
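Figures 4-1 and 4-2 are not reproduced here, but the following sketch suggests the loop structures that typically produce these three growth rates; the method bodies are illustrative, not the book's actual algorithms.

public class GrowthRates {
    // O(n): a single pass over the input; the count grows linearly with n.
    static long algorithmA(int n) {
        long operations = 0;
        for (int i = 0; i < n; i++) {
            operations++;
        }
        return operations; // n operations
    }

    // O(n^2): one loop nested inside another; the count grows quadratically.
    static long algorithmB(int n) {
        long operations = 0;
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < n; j++) {
                operations++;
            }
        }
        return operations; // n * n operations
    }

    // O(1): the same three basic operations no matter how large n is.
    static long algorithmC(int n) {
        long a = 1;   // operation 1
        long b = 2;   // operation 2
        return a + b; // operation 3
    }
}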
4.12
Example. Imagine that you are at a wedding reception, seated at a table of n people. In preparation for the toast, the waiter pours champagne into each of n glasses. That task is O(n). Someone makes a toast. It is O(1), even if the toast seems to last forever, because it is independent of the number of guests. If you clink your glass with everyone at your table, you perform an O(n) operation. If everyone at your table does likewise, a total of O(n²) clinks are performed.
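A short count of the clinks, assuming each distinct pair of guests clinks exactly once, makes the quadratic growth visible; the class and method names are hypothetical.

public class ClinkCount {
    /** One clink per distinct pair among n guests: n(n - 1)/2 in total, which is O(n^2). */
    static long countClinks(int n) {
        long clinks = 0;
        for (int i = 0; i < n; i++) {
            for (int j = i + 1; j < n; j++) {
                clinks++; // guest i clinks with guest j
            }
        }
        return clinks;
    }

    public static void main(String[] args) {
        System.out.println(countClinks(8)); // prints 28, that is, 8 * 7 / 2
    }
}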
4.13
Big Oh notation has a formal mathematical meaning that can justify our discussion in the previous sections. You saw that an algorithm's actual time requirement is directly proportional to a function f of the problem size n. For example, f(n) might be n² + n + 1. In this case, we would conclude that the algorithm is of order at most n²—that is, O(n²). We essentially have replaced f(n) with a simpler function—let's call it g(n). In this example, g(n) is n².
What does it really mean to say that a function f(n) is of order at most g(n)—that is, f(n) is O(g(n)), or f(n) = O(g(n))? Formally, its meaning is described by the following mathematical definition:
Note: Formal definition of Big Oh
A function f(n) is of order at most g(n)—that is, f(n) is O(g(n))—if a positive real number c and a positive integer N exist such that f(n) ≤ c × g(n) for all n ≥ N. That is, c × g(n) is an upper bound on f(n) when n is sufficiently large.
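For example, let f(n) = n² + n + 1 and g(n) = n². Since n ≤ n² and 1 ≤ n² whenever n ≥ 1, we have f(n) ≤ n² + n² + n² = 3n² for all n ≥ 1. Choosing c = 3 and N = 1 therefore satisfies the definition, so f(n) is O(n²).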