This is efficient and requires Θ(n) time. However, it also requires two arrays
of size n. Next is a code fragment that places the permutation in order but does so
within the same array (thus it is an example of an “in place” sort).
for (i=0; i<n; i++)
  while (A[i] != i)  // Swap element A[i] with A[A[i]]
    DSutil.swap(A, i, A[i]);
Function swap(A,i,j) exchanges elements i and j in array A. It may
not be obvious that the second code fragment actually sorts the array. To see that
this does work, notice that each pass through the for loop will at least move the
integer with value i to its correct position in the array, and that during this iteration,
the value of A[i] must be greater than or equal to i. A total of at most n swap
operations take place, because an integer cannot be moved out of its correct position
once it has been placed there, and each swap operation places at least one integer in
its correct position. Thus, this code fragment has cost Θ(n). However, it requires
more time to run than the first code fragment. On my computer the second version
takes nearly twice as long to run as the first, but it only requires half the space.
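For reference, here is one self-contained, runnable version of the in-place fragment; the swap helper below merely stands in for DSutil.swap, and the sample permutation is only an illustration.

public class PermSort {
  // Stand-in for DSutil.swap: exchange elements i and j of A.
  static void swap(int[] A, int i, int j) {
    int temp = A[i];
    A[i] = A[j];
    A[j] = temp;
  }

  public static void main(String[] args) {
    int[] A = {3, 0, 4, 1, 2};   // a permutation of the values 0 through 4
    for (int i = 0; i < A.length; i++)
      while (A[i] != i)          // Swap element A[i] with A[A[i]]
        swap(A, i, A[i]);
    System.out.println(java.util.Arrays.toString(A));  // prints [0, 1, 2, 3, 4]
  }
}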
A second principle for the relationship between a program's space and time
requirements applies to programs that process information stored on disk, as
discussed in Chapter 8 and thereafter. Strangely enough, the disk-based space/time
tradeoff principle is almost the reverse of the space/time tradeoff principle for
programs using main memory.
The disk-based space/time tradeoff principle states that the smaller you can
make your disk storage requirements, the faster your program will run. This is
because the time to read information from disk is enormous compared to computation
time, so almost any amount of additional computation needed to unpack the data is
going to be less than the disk-reading time saved by reducing the storage
requirements. Naturally this principle does not hold true in all cases, but it is good
to keep in mind when designing programs that process information stored on disk.
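As one hedged illustration (not from the text), the sketch below reads a gzip-compressed file through java.util.zip.GZIPInputStream, unpacking the data on the fly as it comes off the disk; the file name data.txt.gz is hypothetical, and whether compression actually wins depends on the data and the machine.

import java.io.*;
import java.util.zip.GZIPInputStream;

public class ReadCompressed {
  public static void main(String[] args) throws IOException {
    // Reading the smaller compressed file and decompressing in memory
    // can beat reading an uncompressed copy, because the CPU time spent
    // unpacking is small compared to the disk transfer time saved.
    try (BufferedReader in = new BufferedReader(new InputStreamReader(
             new GZIPInputStream(new FileInputStream("data.txt.gz"))))) {
      String line;
      while ((line = in.readLine()) != null)
        System.out.println(line);
    }
  }
}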
3.10 Speeding Up Your Programs
In practice, there is not such a big difference in running time between an algorithm
with growth rate Θ(n) and another with growth rate Θ(n log n). There is, however,
an enormous difference in running time between algorithms with growth rates of
Θ(n log n) and Θ(n²). As you shall see during the course of your study of common
data structures and algorithms, it is not unusual that a problem whose obvious
solution requires Θ(n²) time also has a solution requiring Θ(n log n) time. Examples
include sorting and searching, two of the most important computer problems.
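To make the gap concrete, here is a small timing sketch (not from the text) that runs a simple Θ(n²) insertion sort against java.util.Arrays.sort, which runs in Θ(n log n) time on average, over the same random input; the input size and the exact timings are of course machine-dependent.

import java.util.Arrays;
import java.util.Random;

public class SortCompare {
  // A simple insertion sort, requiring Θ(n^2) time on average.
  static void insertionSort(int[] a) {
    for (int i = 1; i < a.length; i++)
      for (int j = i; j > 0 && a[j] < a[j-1]; j--) {
        int t = a[j]; a[j] = a[j-1]; a[j-1] = t;
      }
  }

  public static void main(String[] args) {
    int n = 100000;                         // illustrative input size
    int[] a = new Random(42).ints(n).toArray();
    int[] b = a.clone();

    long t0 = System.nanoTime();
    insertionSort(a);
    long t1 = System.nanoTime();
    Arrays.sort(b);                         // Θ(n log n) on average
    long t2 = System.nanoTime();

    System.out.printf("insertion sort: %d ms%n", (t1 - t0) / 1_000_000);
    System.out.printf("Arrays.sort:    %d ms%n", (t2 - t1) / 1_000_000);
  }
}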
Example 3.18 The following is a true story. A few years ago, one of
my graduate students had a big problem. His thesis work involved several