FIGURE 4-9   The effect of doubling the problem size on an algorithm's
             time requirement

Growth-Rate Function     Growth-Rate Function     Effect on Time
for Size n Problems      for Size 2n Problems     Requirement
1                        1                        None
log n                    1 + log n                Negligible
n                        2n                       Doubles
n log n                  2n log n + 2n            Doubles and then adds 2n
n^2                      (2n)^2                   Quadruples
n^3                      (2n)^3                   Multiplies by 8
2^n                      2^(2n)                   Squares
4.22  Now suppose that your computer can perform one million operations per second. How long will it take an algorithm to solve a problem whose size is one million? We cannot answer this question exactly without knowing the algorithm, but the computations in Figure 4-10 will give you a sense of how the algorithm's Big Oh would affect our answer. An O(log n) algorithm would take a fraction of a second, whereas an O(2^n) algorithm would take many trillions of years! Note that these computations estimate the time requirement of an O(g(n)) algorithm as g(n). Although this approximation is not universally valid, for many algorithms it is reasonable.
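You can reproduce these estimates yourself. The sketch below (class name and layout are mine, not the book's) evaluates g(10^6)/10^6 for each growth-rate function, assuming logarithms are base 2, as is conventional in algorithm analysis:

```java
public class BigOhEstimates {
    public static void main(String[] args) {
        double n = 1e6;      // problem size: one million items
        double rate = 1e6;   // speed: one million operations per second

        // log base 2 of n, since Math.log is the natural logarithm
        double logN = Math.log(n) / Math.log(2);

        // Estimated time = g(n) operations / (operations per second)
        System.out.println("log n   : " + (logN / rate) + " seconds");
        System.out.println("n       : " + (n / rate) + " seconds");
        System.out.println("n log n : " + (n * logN / rate) + " seconds");
        System.out.println("n^2     : " + (n * n / rate / 86400) + " days");
        System.out.println("n^3     : " + (n * n * n / rate / 86400 / 365) + " years");
    }
}
```

Running this confirms the smaller entries in Figure 4-10: roughly 0.0000199 seconds for log n, 1 second for n, about 19.9 seconds for n log n, about 11.6 days for n^2, and about 31,710 years for n^3. The value 2^(10^6) overflows even a double, which is itself a hint of how hopeless an O(2^n) algorithm is at this problem size.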
FIGURE 4-10  The time required to process one million items by algorithms
             of various orders at the rate of one million operations per second

Growth-Rate Function g     g(10^6) / 10^6
log n                      0.0000199 seconds
n                          1 second
n log n                    19.9 seconds
n^2                        11.6 days
n^3                        31,709.8 years
2^n                        10^301,016 years
Note: You can use O(n^2), O(n^3), or even O(2^n) algorithms as long as your problem size is small. For example, at the rate of one million operations per second, an O(n^2) algorithm would take one second to solve a problem whose size is 1000. An O(n^3) algorithm would take one second to solve a problem whose size is 100. And an O(2^n) algorithm would take about one second to solve a problem whose size is 20.
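Each claim in the note follows from the same g(n)/rate estimate; a quick check (class name is mine, not the book's):

```java
public class FeasibleSizes {
    public static void main(String[] args) {
        double rate = 1e6;  // one million operations per second

        // Estimated time in seconds for each growth rate at the stated size
        System.out.println("n^2, n = 1000: " + (1000.0 * 1000.0 / rate) + " s");
        System.out.println("n^3, n = 100 : " + (100.0 * 100.0 * 100.0 / rate) + " s");
        System.out.println("2^n, n = 20  : " + (Math.pow(2, 20) / rate) + " s");
    }
}
```

All three estimates come out at one second or just over it (2^20 = 1,048,576, slightly more than one million operations), which is why such small sizes are the practical limit for these orders.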